Native support for multi-module Maven projects #510
List<PluginMetadata> pluginMetadataList = WarUtils.extractPluginMetadataFromWar(
        config.getWar(),
        PluginMetadataHooks.loadExtractors(config.getExternalHooksJars()),
        config.getIncludePlugins(),
We pass the includes and excludes here rather than obtaining the full list and then filtering. This is not for performance (I think the code is cleaner the other way around and has better separation of concerns) but because the war can contain plugins that are doomed for all eternity (like `gradle` and `ace-editor`).
The gradle plugin could be updated to also include the new metadata, and some dead plugins could be removed in time. But filtering before obtaining metadata means we can get rid of "legacy" extractors much more quickly, and a failure to extract metadata for a plugin can then be considered fatal and handled early.
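A minimal sketch of the filter-before-extract idea (the names here are illustrative, not the actual PCT API): a plugin that is excluded, or absent from a non-empty include list, never reaches an extractor at all, so a "legacy" extractor for it can be deleted without breaking anything.

```java
import java.util.Set;

public class PluginFilter {
    /**
     * Decide whether a plugin should have its metadata extracted at all.
     * Filtering here (rather than after extraction) means a permanently
     * excluded plugin never needs a working extractor.
     */
    static boolean isIncluded(String pluginId, Set<String> includes, Set<String> excludes) {
        if (excludes.contains(pluginId)) {
            return false;
        }
        // an empty include set means "everything not excluded"
        return includes.isEmpty() || includes.contains(pluginId);
    }

    public static void main(String[] args) {
        Set<String> includes = Set.of();
        Set<String> excludes = Set.of("gradle", "ace-editor");
        System.out.println(isIncluded("git", includes, excludes));    // true
        System.out.println(isIncluded("gradle", includes, excludes)); // false
    }
}
```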
} catch (PluginCompatibilityTesterException e) {
    if (lastException != null) {
        e.addSuppressed(lastException);
cloneDir = new File(config.getWorkingDir(), getRepoNameFromGitURL(gitUrl));
The directory will be based on the repo name, not the plugin name.
If consumers are expecting test results in `$workDir/pluginName`, then they need to adapt (probably easiest to use just `workDir` and ephemeral agents, so that there will only be one checkout and run).
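To illustrate why consumers need to adapt, here is a hypothetical `getRepoNameFromGitURL` (a sketch, not the actual PCT implementation): every plugin in a multi-module reactor gets the same checkout directory, derived from the repository URL rather than from any individual plugin name.

```java
public class RepoNames {
    /** Hypothetical sketch: derive the checkout directory name from a git URL. */
    static String getRepoNameFromGitURL(String gitUrl) {
        String name = gitUrl;
        if (name.endsWith("/")) {
            name = name.substring(0, name.length() - 1);
        }
        // keep only the last path segment
        name = name.substring(name.lastIndexOf('/') + 1);
        if (name.endsWith(".git")) {
            name = name.substring(0, name.length() - ".git".length());
        }
        return name;
    }

    public static void main(String[] args) {
        // the clone dir for every plugin in this reactor is "aws-java-sdk-plugin",
        // regardless of the individual plugin artifact ids inside it
        System.out.println(getRepoNameFromGitURL("https://github.com/jenkinsci/aws-java-sdk-plugin.git"));
    }
}
```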
if (lastException != null) {
    e.addSuppressed(lastException);
cloneDir = new File(config.getWorkingDir(), getRepoNameFromGitURL(gitUrl));
// all plugins from the same reactor are assumed to be of the same version
If this is an issue we can either defend against it, or we can include the hash in the map key.
Given how the BOM and CloudBees's proprietary builds work, I do not expect this to be an issue, but I am happy to enhance as required.
@Override
public PluginMetadata apply(PluginMetadata pluginMetadata) throws WrappedPluginCompatabilityException {
    BeforeCheckoutContext c = new BeforeCheckoutContext(pluginMetadata, coreVersion, config);
This keeps the idiom that the `BeforeCheckoutHook` is run before checkout and for every plugin (even when we only check out once for a reactor).
I'm not sure what a hook could reliably do here, but the tag hook is an example of where we still want a hook per plugin and not per repository. This leaves the status quo.
justification =
        "spotbugs false positive due to try-with-resources / https://github.com/spotbugs/spotbugs/issues/2191")
private static String getPCTVersionString() {
    try (InputStream manifestStream = VersionProvider.class.getResourceAsStream("/META-INF/MANIFEST.MF")) {
This uses the metadata we now build into the jar. It is unrelated to the multi-module work, but it was here because it was in all the builds I had for various testing. It should have been pulled out earlier.
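A sketch of the idea: read the version out of the jar's own manifest at runtime. The attribute name `Implementation-Version` and the `"unknown"` fallback are assumptions for illustration, not the actual PCT code.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.jar.Manifest;

public class VersionProvider {
    /**
     * Read a version string from the manifest that the build writes into the jar.
     * Returns "unknown" when running outside a packaged jar (e.g. from an IDE).
     */
    static String getPCTVersionString() {
        try (InputStream manifestStream =
                VersionProvider.class.getResourceAsStream("/META-INF/MANIFEST.MF")) {
            if (manifestStream != null) {
                Manifest manifest = new Manifest(manifestStream);
                String version = manifest.getMainAttributes().getValue("Implementation-Version");
                if (version != null) {
                    return version;
                }
            }
        } catch (IOException e) {
            // fall through to the default below
        }
        return "unknown";
    }

    public static void main(String[] args) {
        System.out.println(getPCTVersionString());
    }
}
```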
> Should have been pulled out earlier.

Why wasn't it?
I was juggling too many workspaces; no real reason. I can still extract it.
OK, but that places a large burden on others who are trying to make sense of such a large change. As a courtesy I have already extracted this, so there is nothing left to be done.
Please squash merge this retaining only the title and Co-authored-by, and do not retain any of my sleep-deprived late-night commit messages (notes to self, not intended for the historical record).
> an external extractor

There should not be any external extractors, and the presence of an external extractor would be technical debt that would prevent us from simplifying this logic in the future.

Perhaps you are forgetting about the multi-module proprietary plugins? E.g. we have an extractor similar in nature to these. Or are you suggesting we sit on this code until all required plugins have been updated?

That could be OK. Can you share a link to what you have in mind?
I think this should be fine.
if (plugins.isEmpty()) {
    throw new MetadataExtractionException("Found no plugins in " + warFile);
}
This breaks the ability to run the PCT against a plugin that exists only in a local clone. That was previously working.
In the multi-module branch there was a check to ensure that all plugins from a given repository use the same tag, but this was not carried over in #510. This changes the assumption that the input is correct (which could silently produce incorrect results) into an explicit check that fails early, before compatibility checking starts. Only plugins that are being checked for compatibility are involved in the check. This allows a war to still contain a plugin that was formerly in the repository but has since been changed to just migrate data to a new location (by adding a dependency on a replacement plugin), even after it has been deleted from the repository.
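The fail-early check described above could look roughly like this (the `PluginMetadata` record and method names are illustrative stand-ins for the real classes): group the plugins under test by repository, and refuse to start if any repository's plugins disagree on the tag.

```java
import java.util.List;
import java.util.Map;

public class TagCheck {
    record PluginMetadata(String pluginId, String gitUrl, String tag) {}

    /**
     * Hypothetical early check: every plugin coming from the same repository
     * must declare the same tag; otherwise fail before any compatibility testing.
     */
    static void checkConsistentTags(Map<String, List<PluginMetadata>> pluginsByRepo) {
        for (Map.Entry<String, List<PluginMetadata>> e : pluginsByRepo.entrySet()) {
            long distinctTags =
                    e.getValue().stream().map(PluginMetadata::tag).distinct().count();
            if (distinctTags > 1) {
                throw new IllegalStateException(
                        "Plugins in " + e.getKey() + " use different tags: " + e.getValue());
            }
        }
    }
}
```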
This changes the way the PCT clones: from per plugin to per repository.

Once the repository is cloned, each plugin in the repo is exercised in turn (serially). Plugins that are excluded (or implicitly excluded when an include list is provided) will not be tested, even if some other plugin in the same repository is tested. It is assumed that all plugins in the same repository use the same tag (this is not explicitly enforced, but can be if deemed necessary).

The pct command is no longer a root command but is now a sub-command, `test-plugins`, and any caller of the PCT will need to be adjusted to use this sub-command. There is a new command, `list-plugins`, that lists all plugins in the war (excluding detached plugins) and can also filter by includes or excludes. Tooling can call this in order to group plugin tests for the same repository. (As above, this does not group by repository and tag; it is assumed that the versions of plugins from the same repository will be the same. This can be added if required, or we can group by repository and commit.)

All `AbstractMultiParentHook`s have been removed, and, similar to the `git url` workaround, the logic has been inlined into code that should become dead once all plugins have been updated to include jenkinsci/maven-hpi-plugin#436. The support is currently via a hook-like `MetadataExtractor`; all implementations of this are expected to be removed when plugins are migrated, simplifying the implementation.

Build/testing unconditionally sets `set.changelist` so that incremental and CD versions have the correct version and can obtain parents or other reactor versions from the repository. This is considered harmless for non-incremental/CD versions.

`MavenRunner` now takes a module name, which will be used as `-pl moduleName` if non-null.

`--local-checkout-dir` is supported via jenkinsci/maven-hpi-plugin#463.

This has been tested with a build against CloudBees as well as local builds with local checkouts of various plugins, including `warnings-ng` and `aws-java-sdk`. `aws-java-sdk` does not have a `MetadataExtractor`, and if it is in the war it is required to be version `1.12.406-373.v59d2b_d41281b_` from jenkinsci/aws-java-sdk-plugin#956. The `maven:hpi` for `--local-checkout-dir` is also an incremental version, which requires incremental support to be enabled when testing.

The code may or may not support timestamped snapshots of plugins in the war. Without this change they are not supported, as the `tag` will be rejected due to it being `head`. After this change, and with jenkinsci/maven-hpi-plugin#436, timestamped snapshots in the war may work (but this has not been tested).

The version of the core is now obtained directly from the war's manifest. If the manifest is being discarded as part of building an uber-war, consumers will need to adapt the building of the war accordingly; an example is shown in the pom where we build our war for testing.