[FLINK-32453] Make main and v3.0 branch build against Flink 1.18-SNAPSHOT #39
Conversation
… change in Flink 1.18. This is a temporary workaround for a breaking change that occurred in `TypeSerializerUpgradeTestBase` due to FLINK-27518. A proper fix would be to introduce a public-facing test utility to replace `TypeSerializerUpgradeTestBase`, and move the Kafka connector test code to use that instead.
LGTM
I'm wondering how we should handle that in the future. So far, only the Kafka connector has utilized this test base. But if we externalize the formats as well, for instance, we would run into the same issue with Avro. And in general, it might become a framework for other connectors to use in the future.
Would it make sense to create a new test base class for connectors in flink-connector-common that implements the test data generation, and mark it as @PublicEvolving to make it more obvious that it's used in external repositories?
Additionally, where is the release documentation for the Kafka connector kept? Having this test in the external repo would mean that the release manager would have to generate the test data manually, but I couldn't find any docs on that one. 🤔
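As a rough illustration of the idea discussed above, such a shared test base could look something like the sketch below. This is purely hypothetical: the class name `ConnectorSerializerUpgradeTestBase` and all method names are invented for illustration and are not actual Flink API. It is framework-free here to keep it self-contained; a real version would live in flink-connector-common and hook into Flink's serializer snapshot machinery.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Hypothetical sketch of a @PublicEvolving-style serializer upgrade test base
// for externalized connector repos. All names are illustrative, not Flink API.
public abstract class ConnectorSerializerUpgradeTestBase<T> {

    /** Flink versions against which upgrade compatibility should be verified. */
    protected abstract List<String> priorFlinkVersions();

    /** Produces an element to be serialized as golden test data. */
    protected abstract T createTestData();

    /** Serializes with the connector's current serializer implementation. */
    protected abstract byte[] serialize(T element);

    /** Deserializes bytes written by an older serializer version. */
    protected abstract T deserialize(byte[] bytes);

    /**
     * A release manager would run this once per release to write golden files
     * that later versions read back in their upgrade tests.
     */
    public Path generateTestData(Path targetDir, String version) throws Exception {
        Path file = targetDir.resolve("serializer-snapshot-" + version + ".bin");
        Files.write(file, serialize(createTestData()));
        return file;
    }

    /** Round-trip check shared by the per-version upgrade tests. */
    public boolean roundTripsCorrectly() {
        T element = createTestData();
        return deserialize(serialize(element)).equals(element);
    }
}
```

The point of centralizing this would be that each connector only implements the four abstract hooks, while the data-generation step the release manager runs stays identical across repositories.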
re: @XComp
I was thinking along the same lines, although it should just be a public-facing test utility that Flink distributes. I can imagine any user implementing their own
There's this: https://cwiki.apache.org/confluence/display/FLINK/Creating+a+flink-connector+release
Thanks. But that's the generic documentation. I was looking for Kafka connector-specific documentation where we would put the data generation step. Maybe adding it to the README for now is sufficient? And we would have to create a follow-up ticket that moves the generic test class into
@XComp there isn't Kafka connector-specific release documentation, as far as I'm aware. So far, the generic doc was good enough, at least for the first
I created FLINK-32582 as a follow-up ticket.
This PR contains two commits that allow the Kafka connector to be built against Flink 1.18-SNAPSHOT. It works around `TypeSerializerUpgradeTestBase`, due to a breaking change introduced in FLINK-27518 across Flink 1.17.x and 1.18.x.
For the former, the long-term fix would be to introduce a public-facing replacement test utility for `TypeSerializerUpgradeTestBase`, and have the Flink Kafka connector test code gradually move to that instead.