
update of macro for postgres/redshift use of unique_key as a list #4858

Merged Mar 22, 2022 (9 commits)

Conversation


@McKnight-42 commented Mar 11, 2022

#4738: updates the macro get_delete_insert_merge_sql to check whether the unique key being passed is a list or a string

Description

updates

{% if unique_key is not none %}
    delete from {{ target }}
    where ({{ unique_key }}) in (
        select ({{ unique_key }})
        from {{ source }}
    );
{% endif %}

to

{% if unique_key is not none and unique_key != [] %}
    {% if unique_key is sequence and unique_key is not mapping and unique_key is not string %}
        delete from {{ target }}
        using {{ source }}
        where (
            {% for key in unique_key %}
                {{ source }}.{{ key }} = {{ target }}.{{ key }}
                {{ "and " if not loop.last }}
            {% endfor %}
        );
    {% else %}
        delete from {{ target }}
        where (
            {{ unique_key }}) in (
            select ({{ unique_key }})
            from {{ source }}
        );
    {% endif %}
{% endif %}
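The nested type checks are the subtle part: in Jinja, a string is itself a sequence, so the `is not string` and `is not mapping` guards are needed to route only real lists into the `delete ... using` branch. The same classification can be mirrored in Python (a sketch for illustration; `is_key_list` is a hypothetical helper, not part of dbt):

```python
from collections.abc import Mapping, Sequence

def is_key_list(unique_key) -> bool:
    """Mirror the Jinja tests: a sequence that is neither a mapping nor a string."""
    return (
        isinstance(unique_key, Sequence)
        and not isinstance(unique_key, Mapping)
        and not isinstance(unique_key, str)
    )

# A plain string (single key) takes the subquery branch;
# a list of keys takes the delete ... using branch.
print(is_key_list("id"))          # False
print(is_key_list(["id", "ts"]))  # True
```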

This pairs with the changes made in #4618, extending the capability to the Postgres and Redshift adapters.

Checklist

  • I have signed the CLA
  • I have run this code in development and it appears to resolve the stated issue
  • This PR includes tests, or tests are not required/relevant for this PR
  • I have added information about my change to be included in the CHANGELOG.

@github-actions

Thank you for your pull request! We could not find a changelog entry for this change. For details on how to document a change, see the contributing guide.

@McKnight-42 self-assigned this Mar 11, 2022
@cla-bot (bot) added the cla:yes label Mar 14, 2022
@McKnight-42 marked this pull request as ready for review March 14, 2022 19:33
@McKnight-42 requested a review from a team March 14, 2022 19:33
@McKnight-42 requested review from a team as code owners March 14, 2022 19:33
@jtcohen6 left a comment

@McKnight-42 and I had a chance to pair this afternoon, and made good progress!

  • Our "standard" tests (copied from plugins) are passing on Postgres here
  • The exact same tests pass against Redshift in local testing
  • And against Snowflake, using the delete+insert incremental strategy

Next steps look like:

  • Rewriting these tests to use the new pytest framework
  • Make them truly cross-database compatible (I think by turning expected models into seeds)
  • Add them to the inheritable adapter tests, and turn them on for Redshift + Snowflake + BigQuery + Spark. Delete the copy-pasted versions present in those repos :)

I'd really like to include this change ahead of cutting v1.1.0-b1 on Thursday, and I don't think those next steps need to block merging this PR. If we do merge with next steps still open, we should open issues for the remaining work.

    from {{ source }}
    );
    {% endif %}
    {% if unique_key %}

I was worried that this may be a breaking change for any users who have defined a unique key named "False". Putting aside how confusing that would be — we tested it out, and this logic should still work.


also an empty string
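Jinja's `{% if unique_key %}` follows Python truthiness, which is why both edge cases above are safe. A quick Python sketch of the analogous behavior:

```python
# Python truthiness mirrors Jinja's {% if unique_key %} test:
assert bool("False") is True  # a column literally named "False" is a non-empty string
assert bool("") is False      # an empty string skips the branch
assert bool([]) is False      # an empty key list skips it too
assert bool(None) is False    # no unique_key configured
```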

    )

    self.assertEqual(status, RunStatus.Error)
    self.assertTrue("thisisnotacolumn" in exc.lower())

This logic needs to be database-agnostic. The dbt-snowflake tests expect a highly specific error message, which is specific to Snowflake. The test xfails correctly on other databases, but then those databases use their own words to tell us that the column is missing. No good.

The move here is to:

  • Only check for the missing column name (thisisnotacolumn) in the error message, since any database worth its salt should include the column name in its error message
  • Lowercase the whole error message, since some databases (Snowflake) will return the column name in uppercase

Turns out, this is also what we need to get the tests passing on Snowflake with delete+insert incremental strategy, since the error message is slightly different (different column alias)
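The two bullet points translate into a check along these lines (a sketch; `assert_missing_column` is a hypothetical helper, not the actual test code):

```python
def assert_missing_column(exc: str, column: str = "thisisnotacolumn") -> None:
    # Only check for the column name, and lowercase the whole message so
    # databases that uppercase identifiers (e.g. Snowflake) still match.
    assert column in exc.lower(), f"expected {column!r} in error message"

# Passes regardless of how the database cases the identifier:
assert_missing_column('invalid identifier "THISISNOTACOLUMN"')
assert_missing_column('column "thisisnotacolumn" does not exist')
```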

@McKnight-42 (PR author) replied:

I have opened a new issue to continue the conversation: #4926


Thanks @McKnight-42! I should have been clearer in my statement above: the logic in this PR is database-agnostic. It isn't in some of the other plugins (since we copy-pasted-edited the test cases). We'll want to sort that out when we go to solve #4882.

@emmyoop left a comment

One small question on your changes to the macro.

But I think the bigger thing is the new tests you added should be added in the new pattern of testing instead of the old integration tests pattern.

@emmyoop commented Mar 16, 2022

@McKnight-42 I just saw @jtcohen6's comment that converting the tests isn't a blocker for getting this into the beta. That's fine, but can you create an issue now to track those last steps so they're not lost?

Next steps look like:

  • Rewriting these tests to use the new pytest framework
  • Make them truly cross-database compatible (I think by turning expected models into seeds)
  • Add them to the inheritable adapter tests, and turn them on for Redshift + Snowflake + BigQuery + Spark. Delete the copy-pasted versions present in those repos :)

@McKnight-42

@emmyoop new issue created #4882

@VersusFacit left a comment

Minor suggestions for readability but the logic looks solid.

@emmyoop left a comment

Looks good now!

@iknox-fa left a comment

This LGTM with the exception of the cross-database concerns @jtcohen6 pointed out.

I wouldn't be opposed to merging as-is for now, since we still only run integration tests with Postgres, as long as we have a follow-on ticket to make it agnostic.

@leahwicz dismissed VersusFacit’s stale review March 22, 2022 15:21

The feedback has been addressed so unblocking this PR

@gshank left a comment

Looks good!
