
Add design document for issuing a warning when targeting netstandard1.x #317

Merged
merged 4 commits into from
Jun 1, 2024

Conversation

terrajobst
Copy link
Member

@terrajobst terrajobst commented May 15, 2024

We'd like to issue a warning when building a project targeting .NET Standard 1.x and recommend the user to upgrade to .NET Standard 2.0.

The reason is that some developers still target .NET Standard 1.x even though it doesn't buy them any additional reach, as all supported flavors of .NET support .NET Standard 2.0 or later.

The goal of this warning is to allow people to set their projects up for success; that is, to have good reach while also having a sizeable feature set (both in terms of framework APIs and the NuGet packages they can reference).

Please note that this will not issue a warning for .NET Standard 2.0/2.1, nor are we changing anything around which versions of .NET Standard can be built or referenced.

@Varorbc
Copy link

Varorbc commented May 15, 2024

Can a warning also be issued when the target framework is 2.0/2.1? Even 2.0/2.1 is missing many newer APIs, so a multi-target framework setup is recommended.

Just as SqlClient has removed support for netstandard 2.0/2.1, I believe more libraries will remove support in the future.

Co-authored-by: Austin Wise <AustinWise@gmail.com>
Co-authored-by: Weihan Li <weihanli@outlook.com>
@terrajobst
Copy link
Member Author

terrajobst commented May 16, 2024

@Varorbc

Can a warning also be issued when the target framework is 2.0/2.1? Even 2.0/2.1 is missing many newer APIs, so a multi-target framework setup is recommended.

Just as SqlClient has removed support for netstandard 2.0/2.1, I believe more libraries will remove support in the future.

Do you mean .NET Standard 2.0/2.1? Then the answer would be no, we don't want to do this yet as .NET Framework is still supported and .NET Standard 2.0 is the only way to build a single binary that can be consumed by both .NET Framework and .NET Core.

If you mean .NET Core 2.x, we already issue a warning when using frameworks that are out of support, like so:

warning NETSDK1138: The target framework 'netcoreapp2.0' is out of support and will not receive security updates in the future. Please refer to https://aka.ms/dotnet-core-support for more information about the support policy.

@WeihanLi
Copy link
Contributor

@terrajobst speaking of .NET Framework support, I have a question: to support .NET Framework in a new package, do we recommend using the netstandard2.0 target or a framework target like net462?

@Varorbc
Copy link

Varorbc commented May 16, 2024

Do you mean .NET Standard 2.0/2.1? Then the answer would be no, we don't want to do this yet as .NET Framework is still supported and .NET Standard 2.0 is the only way to build a single binary that can be consumed by both .NET Framework and .NET Core.

@terrajobst According to the documentation, for .NET Standard versions 1.4 to 2.0 the corresponding .NET Framework version is 4.6.1, and .NET Framework 4.6.1 is no longer supported. If netstandard1.x warns, netstandard2.0 should also issue a warning.

@terrajobst
Copy link
Member Author

@terrajobst speaking of .NET Framework support, I have a question: to support .NET Framework in a new package, do we recommend using the netstandard2.0 target or a framework target like net462?

I generally recommend .NET Standard 2.0 for code that you want to work everywhere. If you don't use <TargetFramework>netstandard2.0</TargetFramework> and instead use multi-targeting (say <TargetFrameworks>net462;net6.0</TargetFrameworks>) then all your consumers also need to multi-target, which is extremely annoying and prone to errors.
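For illustration, the two project shapes look like this (a minimal csproj sketch; the target monikers are just examples):

```xml
<!-- Single .NET Standard 2.0 target: one binary that both
     .NET Framework 4.6.1+ and .NET Core / modern .NET can consume. -->
<PropertyGroup>
  <TargetFramework>netstandard2.0</TargetFramework>
</PropertyGroup>

<!-- Multi-targeting alternative: builds a separate binary per target,
     and tends to push similar decisions onto every consumer. -->
<PropertyGroup>
  <TargetFrameworks>net462;net6.0</TargetFrameworks>
</PropertyGroup>
```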

@dotMorten
Copy link

I don’t really see the point. If I don’t need anything netstd2 offers me there’s no harm in targeting an older version. It’ll still work with newer versions. A warning seems to indicate I’m doing something wrong/bad but I can’t think of a single bad thing here (apart from of course the smaller api surface which is expected from using an older version).

The goal of this warning is to allow people to set their projects up for success; that is, have good reach

I don’t get how increasing my minimum will create better reach? It’ll only create less (many people are still stuck on older, no-longer-supported targets that I might want to support even if Microsoft doesn’t). This doesn’t set me up for greater success - only more warnings, and potentially bumping the target even though I don’t need the extra APIs and not realizing I cut off a user who relied on older versions.

Co-authored-by: Yaakov <yaakov-h@users.noreply.github.com>
@JamesNK
Copy link
Member

JamesNK commented May 16, 2024

The problem with netstandard1.x is that you need to bring in a bunch of NuGet packages, e.g. https://github.com/JamesNK/Newtonsoft.Json/blob/2eaa475f8853f2b01ac5591421dcabd7a44f79ce/Src/Newtonsoft.Json/Newtonsoft.Json.csproj#L73-L90

It's fiddly, and annoying. Meanwhile netstandard2.0 just works. Yes, you get more reach with 1.x, but it's a pain to create a package that has 1.x targets, and netstandard2.0 gets you the targets you want anyway: modern net4.x and modern net5+.

I don’t get how increasing my minimum will create better reach? It’ll only create less (many people are still stuck on older, no-longer-supported targets that I might want to support even if Microsoft doesn’t).

Who are you trying to help by having netstandard1.x? I've said many times in the past that I wish NuGet gave stats about which target in a NuGet package was being used. Unfortunately we don't have that. What I can say anecdotally is that when I dropped portable class library targets (remember those?) from Newtonsoft.Json, not a single person complained.

My theory is if a person has an app that only supports 1.x, then it's way out of support. It's not like people are choosing to upgrade NuGet packages to the latest versions in these apps anyway.

@dotMorten
Copy link

The problem with netstandard1.x is that you need to bring in a bunch of NuGet packages.

Only if you need those APIs. That's one reason why you might want to move up to netstandard2.0, but my argument was: if I don't need the extended API surface, it seems more prudent to target only as low a version as I need. Giving me a warning because I don't need the full netstandard2.0 API surface seems silly.

it's a pain to create a package that has 1.x targets

I don't get this. There's no practical difference in creating packages targeting 1.x vs 2.0.

@JamesNK
Copy link
Member

JamesNK commented May 16, 2024

You need to figure out the right packages for the APIs you want to reference. It's not extremely hard, just annoying. IMO the juice isn't worth the squeeze. Another negative is that netstandard1.x creates a larger NuGet dependency graph, which is slower to restore and build (someone correct me if I'm wrong here).

Remember that this would just be a warning saying that you - a developer creating a library NuGet package - shouldn't need to do this because 1.x only platforms are out of support. If you want to keep making netstandard1.x packages, then you can ignore the warning.
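For example, suppressing an SDK warning is a one-line project-file change (sketch; `NETSDKxxxx` is a placeholder for whatever diagnostic ID the new warning ships with):

```xml
<PropertyGroup>
  <TargetFramework>netstandard1.3</TargetFramework>
  <!-- Placeholder ID: substitute the actual diagnostic ID once it ships. -->
  <NoWarn>$(NoWarn);NETSDKxxxx</NoWarn>
</PropertyGroup>
```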

I see this as a positive step to move the ecosystem forward by removing no longer necessary history.

@dotMorten
Copy link

dotMorten commented May 16, 2024

Are you officially deprecating netstandard 1.x ?
Since it's just a minimum list of APIs required for someone to implement to be vX.Y compliant, it just feels odd to be deprecating that list, because the currently known implementers of the various Microsoft .NET runtimes have all moved on.
How does this affect other .NET implementations, like Mono, Nano Framework, Micro Framework, Unity3D etc?

Bottom line is I don't think it should be a warning if this isn't something that is actually officially deprecated. All the current .NET implementations all support .NET Standard 1.x libraries, since they are merely a subset of 2.0.

@JamesNK
Copy link
Member

JamesNK commented May 16, 2024

I'm not doing anything other than providing my 2 cents to Immo's proposal.

Immo can say for sure, but it sounds like this is just adding a compiler warning and can probably be suppressed.

@MichalStrehovsky
Copy link
Member

If I don’t need anything netstd2 offers me there’s no harm in targeting an older version

Just to give a concrete example of real harm when targeting older version, my favorite "project is targeting NetStandard < 2" bug:

There are no guarantees around referential equality of MemberInfos (except for runtime provided Type). NetStandard < 2 doesn't have operator== on MemberInfo, so any comparison is referential comparison. Because of this, comparisons involving MemberInfo will often end up not equal on UWP (or Native AOT too). It's a real mystery bug unless you know to expect it. Recompiling for netstandard2.0 fixes those issues.

There are probably other examples where better overloads of various methods became available in netstandard2.0. Compiling for netstandard2.0 actually produces a different assembly, not just a different version of assembly references.
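A minimal sketch of that bug class (type names are illustrative):

```csharp
using System;
using System.Reflection;

class Sample { public void M() { } }

class Program
{
    static void Main()
    {
        MethodInfo a = typeof(Sample).GetMethod("M");
        MethodInfo b = typeof(Sample).GetMethod("M");

        // Compiled against netstandard2.0 or later, this calls
        // MemberInfo.operator==, which compares the underlying members.
        // Compiled against netstandard1.x, there is no operator== on
        // MemberInfo, so the compiler emits a reference comparison -
        // which runtimes like UWP / Native AOT do not guarantee to be
        // true for two lookups of the same member.
        Console.WriteLine(a == b);
    }
}
```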

@Perksey
Copy link
Member

Perksey commented May 16, 2024

Ultimately .NET Standard isn’t a framework unto itself, it’s an API contract. I agree with @dotMorten’s point. Consider cases like OpenGL, which is also an API contract with many implementations, generally you only target the minimum version for the functionality you actually need. It is true that most people target NS 2.0 by default because generally this is the path of least resistance for the users looking to build for all .NET implementations, but this needn’t detract from the contract vs framework argument.

@MichalStrehovsky
Copy link
Member

Consider cases like OpenGL, which is also an API contract with many implementations, generally you only target the minimum version for the functionality you actually need

OpenGL is a C API and C doesn't have function overloading. Merely changing the version of the API contract will not change what method is called, it just makes a different set of methods available. In .NET, changing the version of the API contract can and will change what method is called due to overloading. Usually the compiler will select a better/more specialized version of the method. Targeting a lower version of the contract when it's not actually necessary to go that low produces a worse assembly not just because there are fewer APIs to call consciously, but also fewer APIs for the compiler to bind to. It can even lead to bugs that got fixed later.

@huoyaoyuan
Copy link
Member

According to the documentation, for .NET Standard versions 1.4 to 2.0 the corresponding .NET Framework version is 4.6.1, and .NET Framework 4.6.1 is no longer supported.

@Varorbc A correction for this: those .NET Standard versions do not correspond to .NET Framework 4.6.1; they are supported by .NET Framework 4.6.1 and above. The must-support version is the highest .NET Standard version that is supported by the lowest supported .NET Framework version. In this case, the highest .NET Standard version supported by each of .NET Framework 4.6.2-4.8.1 is .NET Standard 2.0.

@LuisApplivity
Copy link

LuisApplivity commented May 16, 2024

Both sides of the argument make valid points: As a STANDARD, it makes no sense to deprecate and/or issue a warning against a subset of an otherwise-encouraged standard (.NET Standard 2.0). So props to @dotMorten and @Perksey on that.

On the other hand, and this is 100% Microsoft's fault for deliberately obfuscating/removing references and explanations in older documentation over time, there are the facts that both @JamesNK and @MichalStrehovsky alluded to, which is that .NET Standard 1.x is NOT purely a subset of .NET Standard 2.0 (and 2.1). The inner workings are different. This is also part of why .NET Standard 2.0 was NOT called .NET Standard 1.7, and the final 1.x version remained at 1.6. There are various, UN-intuitive under-the-hood changes in how they function, in what exactly they support, and so on.

In my personal view, I wholeheartedly agree .NET Standard 2.0 should be favored, for the developer's sake, BUT there is absolutely NO reason working .NET Standard 1.x projects should be nagged with any form of warning, so I ultimately disagree on @terrajobst's proposal. Because all supported versions of .NET implement .NET Standard 1.x, no version of .NET Standard should ever be deprecated, nor have warnings issued against. Calling it a "standard" would become a lie if that happens.

There is one more reason for my disagreement, which is that both .NET Standard 2.0 and 2.1 LACK a few features that are present in .NET Standard 1.5 and 1.6, meaning those working 1.5/1.6 projects have a risk of breaking when moving "up" to 2.0 or 2.1, for absolutely no benefit. Admittedly, these features are very few, and rarely ever used, but are ultimately there. The caveats mentioned by @JamesNK and @MichalStrehovsky will not matter for robust, mature, working projects that need not be touched. That being said, I do agree that, when/if possible, 1.5/1.6 projects should be moved either down to 1.3 (better reach), or up to 2.0 (better almost-everything), to avoid these types of confusion and special caveats.

As a side bonus, .NET Standard 1.3 and lower are implemented by .NET Framework 4.6, which is the latest .NET to run on Windows Vista, even if they are no longer supported. Still worth mentioning. @JamesNK is right, though, in that usually people using such will also not care if a newer Package version no longer supports their peculiar use-case etc. (I miss PCL by the way! You can count me as your first "complaint"! 😉 What if I want to target the Xbox 360 with a reusable class library? It still plays games in 2024!).

@huoyaoyuan
Copy link
Member

which is that .NET Standard 1.x is NOT purely a subset of .NET Standard 2.0 (and 2.1). The inner workings are different. This is also part of why .NET Standard 2.0 was NOT called .NET Standard 1.7, and the final 1.x version remained at 1.6.

This is not true. @MichalStrehovsky 's example is more about how adding stuff can make a change. It is a strict superset. The major change of .NET Standard 2.0 is how things are assembled together.

As a side bonus, .NET Standard 1.3 and lower are implemented by .NET Framework 4.6, which is the latest .NET to run on Windows Vista, even if they are no longer supported.

Worth considering that the SDK doesn't issue a warning when targeting unsupported net46, unlike lower versions of .NET Core.

@LuisApplivity
Copy link

which is that .NET Standard 1.x is NOT purely a subset of .NET Standard 2.0 (and 2.1). The inner workings are different. This is also part of why .NET Standard 2.0 was NOT called .NET Standard 1.7, and the final 1.x version remained at 1.6.

This is not true. @MichalStrehovsky 's example is more about how adding stuff can make a change. It is a strict superset. The major change of .NET Standard 2.0 is how things are assembled together.

Wrong, it is not a strict superset, a very tiny number of APIs are exclusive to 1.5 and 1.6. There was a "rollback" with .NET Standard 1.5, because it originally supported .NET Framework 4.6.2, but NOT 4.6.1, because some of its APIs were not available to 4.6.1. This was later deemed as a mistake by Microsoft, since 4.6.1 was very active at the time, and it was desired for 4.6.1 to consume .NET Standard 1.5+ packages.

This is also why it was often recommended to jump from .NET Standard 1.4 straight to 2.0 at the time, and skip 1.5 and 1.6 entirely.

Nowadays that info is seriously buried/deleted, but here are some links I still happen to have containing some artifacts of this:

https://web.archive.org/web/20170817143725/https://docs.microsoft.com/en-us/dotnet/standard/net-standard

https://web.archive.org/web/20160416214114/https://github.com/dotnet/corefx/blob/master/Documentation/architecture/net-platform-standard.md

Finally, I brought up what @MichalStrehovsky said in the context that there are little known, unintuitive differences between .NET Standard 1.x and 2.x.

@MichalStrehovsky
Copy link
Member

Finally, I brought up what @MichalStrehovsky said in the context that there are little known, unintuitive differences between .NET Standard 1.x and 2.x.

To be clear, the differences I wrote about are pretty much:

class C
{
    // Exists in NetStandard X
    public static void Do(object x);

    // Added in NetStandard X + 1
    public static void Do(int x);
}

The code will compile in both NetStandard X and X + 1, but if compiled against NetStandard X + 1, Do(42) will call the more specialized and efficient overload. This will always be the case. More efficient overloads get added to the framework all the time and just bumping the target framework lets one pick up the benefit for free, without any edits to the C# source file. Sticking to an old one is compatible with old runtimes, but comes at a cost.

@LuisApplivity
Copy link

LuisApplivity commented May 16, 2024

[...] There was a "rollback" with .NET Standard 1.5, [...]

Minor correction, the "rollback" was with .NET Standard 2.0, in that it is a 100% API superset of .NET Standard 1.4, but left behind a very small number of .NET Standard 1.5 and 1.6 APIs.

I wish someone would find the exact Microsoft (GitHub?) link/discussion that addressed this. There was even mention that this decision was made, because they believed that the few 1.5/1.6-exclusive features would impact only a very small amount of existing applications.

@slang25
Copy link

slang25 commented May 16, 2024

A reason I try to avoid netstandard < 2.0 is that I get an explosion of packages. Am I remembering that right? For me this is reason enough to target ns2.0 even though I'm only using a subset.

@dotMorten
Copy link

dotMorten commented May 16, 2024

I think @MichalStrehovsky raises an interesting point about overloads.
But I still don’t quite get the argument. If we were to follow that, it would mean any time a new release of anything introduces new more explicit overloads we should start warning people not to use previous versions.

Copy link

@JonDouglas JonDouglas left a comment


🚢🐿️

@jzabroski
Copy link

Agree.

Can't believe anyone would waste finger movements typing reasons to still target ns1.x but

@mgravell
Copy link
Member

mgravell commented May 17, 2024

Totally support the warning for < 2.0; as a minor observation:

and .NET Standard 2.0 is the only way to build a single binary that can be consumed by both .NET Framework and .NET Core.

Now that libraries are usually shipped via nupkg, this honestly doesn't interest me as a concern - shipping multiple binaries that work on the intended targets is just as simple as (perhaps simpler than) shipping a single binary. Personally, I've reached the point where when I see ".NET Standard", I internally read that as "exotic unsupported runtimes - things like Unity; if it works, it works". If my intention is to support .NET Framework, it is easier to be explicit about that and have a net472 / net48 target.

But on the question of adding a warning: 👍

@jzabroski
Copy link

It's easier to write polyfills when targeting netstandard2.0 than

<TargetFrameworks>net48;net8.0</TargetFrameworks>

I only use net48 for "startup projects" like console programs, desktop applications, web services, websites.
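To sketch that point (the polyfill shown is a common real one, but the wiring is illustrative): with a single netstandard2.0 target a polyfill type is compiled in unconditionally, whereas net48;net8.0 multi-targeting needs #if guards so the type isn't duplicated on targets where it already exists:

```csharp
// With <TargetFramework>netstandard2.0</TargetFramework> this file can be
// included as-is. With <TargetFrameworks>net48;net8.0</TargetFrameworks>
// it must be guarded, because net8.0 already ships this type.
#if !NET5_0_OR_GREATER
namespace System.Runtime.CompilerServices
{
    // Lets C# 'init' property accessors compile on older targets.
    internal static class IsExternalInit { }
}
#endif
```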

@LuisApplivity
Copy link

@jzabroski Please read the discussion more carefully: These were not reasons to target ns1.x, they were reasons not to issue a warning. Completely different meaning.

While there are and will be multiple good reasons to target it, as demonstrated above, we all seem to unanimously agree ns2.0 is simply best in practice. However, as a subset, issuing a warning against ns1.x is simply senseless, even though that's exactly what's underway, as we can see.

@jzabroski
Copy link

jzabroski commented May 24, 2024

I did. In terms of SDK obsolescence etiquette, it is necessary but not sufficient to emit a warning discouraging targeting pre-netstandard2.0.

By not supporting the warning, project developers need to pull in many dependencies, often with nuget semver package versions different than assembly versions, and debug assembly binding errors that most developers are untrained and unskilled in how to debug. I have supported two very popular open source projects where at least fifty issues have been logged related to developer confusion over targeting older versions of .NET Framework than 4.8. The reality is, Immo fixed this mess by halting iteration on netstandardX.Y altogether, and now he is closing this chapter completely for the best. I could go into depth discussing this and why wildcard transitive dependency resolution was a very bad idea in the first place, but it's a waste of time as nobody except Azure Functions users has submitted such issues to my projects since Immo cleaned all this up as part of .NET 5. Azure Functions is still a mess, but they can have their own little playground to mess around in. People who genuinely care can use chiseled containers on Azure Functions now and bypass a lot of the messy stuff.

@tannergooding
Copy link
Member

There's no practical difference in creating packages targeting 1.x vs 2.0.

It's worth noting there is a significant difference - so much so that even if you wanted to support netstandard1.x and had no use for APIs in netstandard2.0 or later, we still explicitly recommend that users multi-target netstandard2.0 in that scenario, due to the sheer number of facade assemblies and potential issues around type forwards that can show up for ns1.x-only packages. We have a multitude of links to documentation on this topic, and some of the quirks for the various 1.x targets are listed under https://learn.microsoft.com/en-us/dotnet/standard/net-standard

There were growing pains, and we really only got the quirks ironed out with .NET Standard 2.0. So targeting 1.x has a lot of rough edges and is something users should only do with very careful consideration, but even then you're likely better off simply multi-targeting the out-of-support .NET Framework versions at that point. And then, as others have indicated, frameworks that are sticking on .NET 4.5 probably aren't rapidly upgrading dependencies anyway, so it is not unreasonable for the latest version of .NET to start warning. You can always ignore the warning, and you can always continue supporting any such customers on an older version of your package that they are known to be on.

Decisions like these are considered with respect to what's best for the entire ecosystem and not necessarily with what might be best for developers sticking around on versions that went EOL years ago, who aren't even using the latest version of the product (so will never see the warning), or for the increasing minority of libraries that are multi-targeting both.

@LuisApplivity
Copy link

LuisApplivity commented May 27, 2024

There were growing pains, and we really only got the quirks ironed out with .NET Standard 2.0. So targeting 1.x has a lot of rough edges and is something users should only do with very careful consideration [...]

100% agreed.

We have a multitude of links to documentation on this topic and some of the quirks for the various 1.x targets listed under https://learn.microsoft.com/en-us/dotnet/standard/net-standard

Unfortunately, that link doesn't contain much to help others understand what is so different about targeting 1.x as opposed to 2.0, all it does is mention the higher dependency graph. Aside from that, all it contains is text recommending 2.0 and, at most, a footnote to point out .NET Framework 4.6.1 support is flakey, without specifying the actual reason (real reason was deleted long ago, as I showed earlier), which has to do with both 1.x and 2.0.

but even then you're more likely better off simply multitargeting the out of support .NET Framework versions at that point.

Like .NET Framework 4.6 etc.? Agreed for most cases. However, keep in mind .NET Standard 1.x is still useful for supported versions as well, such as .NET Framework 4.8 and .NET 8, because it's often useful to engineer your code and patterns within a subset, such as .NET Standard 1.x, and have that enforced by targeting it, as a tunnel that helps avoid undue complexity. Similar arguments also apply to setting <LangVersion>7.3</LangVersion> for C#, and other such things.

In other words, people interested in .NET Standard 1.x are also those who are targeting supported runtimes, as well, not only out-of-support ones, and a warning is simply an unnecessary nag and hindrance. Let's not forget, this warning practice has only been done in the past for unsupported runtimes such as earlier .NET Core, but all supported runtimes such as .NET 8 and .NET Fr. 4.8 implement and will keep implementing .NET Standard 1.x (and thus also 2.0).

You can always ignore the warning, you can always continue supporting any such customers on an older version of your package that they are known to be on, etc.

Until they remove it from a later version of Visual Studio as they did with some of the PCL (Portable Class Library) targets, and state the reason of removal as "we already were issuing a warning against it anyway". Then you need older VS versions, and other older tooling, but customer won't have or allow it, and so on. See the headache snowball? It starts with the size of a tiny pebble like this.

This also damages people's confidence in .NET Standard 2.0, and can/will be used as a precedent to warn/attack/remove it eventually, too, possibly some years later, despite being so much more useful than multi-targeting in most cases. It's the usual "cooking the frog" pattern, since ultimately Microsoft doesn't want multiple targets, but just their own tightly-controlled .NET (Core) platform, and only "accept" implementations that are forking from it from intimate partners, such as Samsung's Tizen.

It's good for Microsoft, but bad for people, the industry as a whole, and for developers in general. No benefit will come out of issuing the warning. It's a bad practice. Relatively few people target .NET Standard at all; most don't even understand what it's about to this day, let alone an older standard like 1.x, so the ones affected by the warning will be the people who do understand it and have their reasons for using it.

@jzabroski
Copy link

it's often useful to engineer your code and patterns within a subset, such as .NET Standard 1.x, and have it enforced by targeting it, as a tunnel that helps avoid undue complexity

There are two kinds of complexity: internal and external. Newer versions of .NET generally reduce internal complexity with respect to performance, while adding external complexity with respect to overloads (e.g., the .NET 9 Span and variadic-argument improvements).

This also damages people's confidence in .NET Standard 2.0, and can/will be used as a precedent to warn/attack/remove it eventually, too, possibly some years later, despite being so much more useful than multi-targeting in most cases.

Microsoft and its competitors are commercial entities and will do what is necessary to stay in business while making the majority of its customers happy. The idea of having a "minimum version" to support was something the Android SDK team originally tried. As of 2021, Google set the global minimum version for new uploads to the Play Store to be 30.

Details on Android Levels

Important to note, Android Levels are not compile-time type-checked like .NET Standard is. The main problem with these Levels is that they still require global compile-at-head behavior. Plus, after a while, the regression suite required to backtest all the devices going back to the original "minimum version" is enormous, and very few developers in practice can actually reason about what the min-max API surface area even looks like, much less predict what will actually happen. The problem is so hard that Google built the world's most sophisticated buildchain tool, https://bazel.build/ , with explicit support for debugging and reporting transitive package references. And people still ship Play Store apps that rate 2 stars or lower. But the main point here is that Google, with its uber-fancy build tool that explicitly manages crazy dependencies, still wants developers to use 30 as the minimum version number. That is years of API levels that are obsolete. (I believe the Android decision coincided with ARM processor architecture adoption, allowing Mac developers to use Android Studio on an M1 computer.)

The only API that has not seen much deprecation that is widely used commercially is JavaScript in the browser. There, there is no standard, and developers detect features and use progressive enhancement or graceful degradation or informal minimum version as enforced by either choice of stylesheeting technology or web development framework. However, the behavior is relatively standardized now due to webkit engine being open source and all major browsers implementing it as the engine.

@teo-tsirpanis
Copy link
Contributor

it's often useful to engineer your code and patterns within a subset, such as .NET Standard 1.x, and have it enforced by targeting it, as a tunnel that helps avoid undue complexity

I don't understand this argument. Complexity in a codebase originates from the overall structure of the code, not the specific external APIs it uses, or even is able to use. There are better ways to restrict yourself from using certain APIs if you want to, like the banned API analyzer, team code review, or just self-discipline.

Besides the transitive dependency explosion, targeting .NET Standard 1.x limits you so much in both the inbox APIs and the third-party packages you can use, and combined with the fact that all frameworks supported by NS1.x and not NS2.0 are unsupported, I don't see any valid technical reason to use NS1.x in a maintained codebase.
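For reference, the banned API analyzer mentioned above is wired up roughly like this (the package version is illustrative; check NuGet for the current one):

```xml
<ItemGroup>
  <PackageReference Include="Microsoft.CodeAnalysis.BannedApiAnalyzers"
                    Version="3.3.4" PrivateAssets="all" />
  <AdditionalFiles Include="BannedSymbols.txt" />
</ItemGroup>
```

BannedSymbols.txt then lists documentation-comment IDs, one per line, e.g. `T:System.Console;Use the logging abstraction instead` or `M:System.DateTime.get_Now;Use UtcNow`.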

@LuisApplivity
Copy link

I don't understand this argument. Complexity in a codebase originates from the overall structure of the code, not the specific external APIs it uses, or even is able to use. There are better ways to restrict yourself from using certain APIs if you want to, like the banned API analyzer, team code review, or just self-discipline.

You are definitely correct in that self-discipline, team code review and "banned API analyzer" all aid to this goal, but all those 3 have a cost, and can nonetheless be used together with NS1.x, rather than in its place.

For example, some team members won't have that much self-discipline, and/or can be relieved of some of it via the restriction. Setting up the banned API analyzer costs some time and adds some complexity of its own; it's not as straightforward as merely targeting NS1.x. Team code reviews are indispensable, but not free: they eat up production time, in some workplaces far more than others. However, by keeping the API surface thinner, you simplify the code reviews and remove time overhead by nature, and the longer the project, the greater the benefit. Some people even get fed up and leave companies due to excessive rigor or time spent in code reviews, so the more streamlined reviews are, the more we also avoid that problem. We get all kinds of real-world benefits by following a "less is more" principle.

That being said, thank you for pointing out "banned API analyzer", that's an excellent resource I hadn't yet come across that I will definitely make use of here on out.

Besides the transitive dependency explosion, targeting .NET Standard 1.x limits you so much in both the inbox APIs and the third-party packages you can use, and combined with the fact that all frameworks supported by NS1.x and not NS2.0 are unsupported, I don't see any valid technical reason to use NS1.x in a maintained codebase.

Personally, despite everything, I like to target NS2.0 myself by default, not 1.x. That being said, that is just me and you (and others who do the same as us). By that, I mean to say other people choosing 1.x will have their reasons, we cannot anticipate all of them, the real-world market is incredibly varied with all kinds of use cases and individuals, and it's those individuals that should be making the calls, without any discouragement or warnings, because they understand their business and their environment as insiders, rather than us.

@tannergooding
Member

however by keeping the API surface thinner, you simplify the code reviews

This is very subjective and quite often not the case.

Minimizing the standard library surface area can theoretically make code reviews smaller, but in practice it doesn't, because it really comes down to a simple set of steps:

  • Do I need to do 'x'?
    • If yes, does the BCL provide an API in box?
      • If yes, great, nothing further to do
      • If no, well now you have to go write your own implementation
    • If no, then an API existing or not existing doesn't matter

So in practice, there is very little that targeting a more minimal surface area gives. You might see slightly fewer items in IntelliSense, but that's really not going to be distinguishable for typical users since the bulk of the API surface is in namespaces like System, System.Collections.Generic, System.IO, System.Linq, System.Net.Http, System.Threading, and System.Threading.Tasks anyways (which are also the default using statements new files get access to by default).

Instead, what ends up happening is that you have less functionality in box so PRs end up being larger and needing more scrutiny due to having to polyfill all the bits that are missing. Then, when it actually comes to platform support you aren't really winning anything anyways.

For reference, the number of APIs available and supported target platforms is:

  • NS1.0 - 7949/37118 APIs, .NET Core 1.0+, .NET Framework 4.5+, Mono 4.6+, Xamarin.iOS 10.0+, Xamarin.Mac 3.0+, Xamarin.Android 7.0+, UWP 8.0+, Unity 2018.1+
  • NS1.1 - 10239/37118 APIs
  • NS1.2 - 10285/37118 APIs, .NET Framework 4.5.1+, UWP 8.1+
  • NS1.3 - 13122/37118 APIs, .NET Framework 4.6+, UWP 10.0+
  • NS1.4 - 13140/37118 APIs, .NET Framework 4.6.1+
  • NS1.5 - 13355/37118 APIs, UWP 10.0.16299+
  • NS1.6 - 13501/37118 APIs
  • NS2.0 - 32638/37118 APIs, .NET Core 2.0+, Mono 5.4+, Xamarin.iOS 10.14+, Xamarin.Mac 3.8+, Xamarin.Android 8.0+
  • NS2.1 - 37118/37118 APIs, .NET Core 3.0+, Drops .NET Framework, Mono 6.4+, Xamarin.iOS 12.16+, Xamarin.Mac 5.16+, Xamarin.Android 10.0+, Drops UWP, Unity 2021.2+

Where .NET 5+ expands this significantly and each subsequent version of .NET (6, 7, 8, and even 9 when it ships this November) is effectively a new standard baseline that has expanded upon this further. https://apisof.net/diff can be used to see what APIs were added or changed between different versions of .NET, which can be quite useful to find out what has been added or changed.

So given that, by targeting NS1.x rather than NS2.0 you're potentially gaining .NET Framework 4.5-4.6 support, legacy Mono/Xamarin, and UWP 8.0-10.0 support (all of which are unsupported, no longer getting bug fixes, etc.) while significantly restricting the surface area available, thus requiring you to do more overall work. All while also giving a worse consumption experience to downstream consumers of your package, causing their deployment size to increase drastically due to the number of facade assemblies required, and reducing performance/startup/compilation time due to the number of additional references that exist.

By that, I mean to say other people choosing 1.x will have their reasons

Indeed, and such users who believe they are in such a specialized scenario can suppress the warning. Ecosystems have a need to move forward and part of that includes surfacing diagnostics to users around things that are obsolete/deprecated and which may be problematic. Users have the freedom to ignore that warning, but they are then opting into the risks and downsides that come about from that.

@AustinWise
Contributor

From the design:

The warning should be suppressable via the NoWarn property.

So if people know they really want to target .NET Standard 1.x they can easily suppress the warning.
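As a concrete sketch of what that suppression looks like (the diagnostic ID below is a placeholder of my own; the actual ID would be assigned by the SDK per the design):

```xml
<PropertyGroup>
  <TargetFramework>netstandard1.3</TargetFramework>
  <!-- Keep targeting NS1.x intentionally and mute the new warning.
       Replace NETSDKXXXX with the actual diagnostic ID once assigned. -->
  <NoWarn>$(NoWarn);NETSDKXXXX</NoWarn>
</PropertyGroup>
```

The `$(NoWarn);` prefix appends to any suppressions already set elsewhere rather than replacing them.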

Personally I think this warning is a good thing. I have found .NET Standard hard to explain to people. The guidance on which version of it to target has changed over time. So I would not be surprised if there are people using the old version without understanding the implications.

@jzabroski

@LuisApplivity The key thing I caught you say was:

Personally, despite everything, I like to target NS2.0 myself by default, not 1.x. That being said, that is just me and you (and others who do the same as us). By that, I mean to say other people choosing 1.x will have their reasons, we cannot anticipate all of them

Everyone who is an expert on .NET pretty much can and does know the list of problems people would run into targeting 1.x, and the lift of addressing them - and it's pretty huge - think 10,000 person-hours of labor:

There would be a lot of actual work to make me want to use .NET Standard 1.6 or lower. Off the top of my head, assuming we ignore the polyfill challenges with Async Interfaces, and other libraries:

  1. Microsoft Docs. Figuring out what the lowest "standard" version a given API supports is time consuming and clicky-tappy. It's doable, but if you select ".NET Standard 1.6" for an API that does not support .NET Standard 1.6, Microsoft Docs forwards you to the latest major version of (modern) .NET.
  2. Libraries. Joel Verhagan has written a nuget.org package analyzer (https://github.com/nuget/insights) and a kusto query language front-end to write custom queries - the reality is the incoming velocity to .NET Standard <=1.6 is extremely low.
  3. In-box vs. out-of-box. With .NET Standard 2.0, there was a push to move more Microsoft-sponsored packages "in-box" (batteries included), so that you just needed the right TargetFramework and SDK and you would get access to a lot of packages without needing to declare dependency versions. With the introduction of tree-shaking (dead code elimination) for slim libraries, you get a whole bunch of libraries and then the compiler shakes out what you don't need.
  4. Hosting and Host Builders. With .NET Core App 3.0, .NET 5, and .NET 6, there were progressive iterations to building application hosts to run executables, and while these are end-point concerns, normally a .NET Standard 1.6 library would write helpers for host builders as part of Microsoft DI support - and so targeting .NET Standard 1.6 and lower is a pain because you'd need to ship two sets of binaries for DI support. It can be done, but it's double the binaries, double the documentation, and more than double the support requests/bugs.
  5. Open Telemetry Support - Any modern big development budget app is going to want to have the benefit of distributed tracing and monitoring, and you don't get that with .NET Standard 1.6 or lower. There would have to be backports not just from Microsoft, but key partners like Datadog.
  6. Loss of performance optimizations - Insurmountable? Some things in modern .NET just require modern technology to work the way users expect at scale - and it's unethical in an era of climate change to push users to inferior performance. API analyzers let you work around this otherwise insurmountable obstacle by curtailing whatever APIs you don't want to use. There are big companies like LL Bean that have used tools like NDepend to analyze APIs and build modular applications, even before API analyzers existed.

For all these reasons and the ones already mentioned, nobody actually wants to target .NET Standard 1.6 or lower. Really. The ergonomics are terrible.
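The usual alternative to picking a single low TFM - and the one point 4 above alludes to with its "two sets of binaries" - is multi-targeting. A minimal sketch of a library that keeps NS2.0 reach while lighting up modern APIs (the specific TFMs here are just an example):

```xml
<PropertyGroup>
  <!-- netstandard2.0 asset for broad reach, net8.0 asset for modern
       consumers; NuGet selects the best match for each consumer. -->
  <TargetFrameworks>netstandard2.0;net8.0</TargetFrameworks>
</PropertyGroup>
```

Inside the code, `#if NET8_0_OR_GREATER` blocks can then gate the paths that use newer APIs.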

@JonDouglas

For what it is worth, .NET Standard 1.x is < 2% (all versions combined) of assemblies on NuGet.org; .NET Standard 2.0 is ~23% and .NET Standard 2.1 is ~2%. Just a quick query, but hopefully it helps this conversation.

In addition, people can use NuGet.org's newish Search by TFM feature to help find specific packages w/ those assets.

(Screenshots of NuGet.org TFM search results for .NET Standard 1.x, 2.0, and 2.1 omitted.)

@LuisApplivity

LuisApplivity commented May 29, 2024

I believe we all made as many meaningful contributions to the main discussion as we could by this point, and hopefully it will assist future readers into making better-informed decisions.

On a mostly-unrelated note, and I emphasize such an offtopic matter is not being brought up by myself, but rather I'm simply answering it:

[...] and unethical in an era of climate change to push users to inferior performance. [...]

Oh, please, spare us your hypocrisy: with each release of .NET, more and more platforms are dropped and made incompatible. Windows 11 itself dropped support for a gigantic amount of hardware, encouraging it all to go to the landfill, and is not alone in this deed (look at Apple, Samsung, etc.), so the more .NET goes "forward", the less eco-friendly it gets. You want eco-friendly .NET? Use Mono and MonoDevelop. Not to mention the ever-increasing bloat and telemetry, both in the newer OSes and newer .NET, which not only waste CPU cycles but are also frequently used in a privacy-infringing manner. I'm not sure how you could have made your "climate change" remark any more ironic.

You cannot accurately put ".NET" and "ethical" in the same sentence.

You would also never want to waste CPU cycles on non-natively-compiled platforms. Unless all you do is AOT (ahead-of-time) compilation, try ANSI C89 / ISO C90 for actual true cross-platform, universal OS, compiler and hardware support. You cannot go more "green" than that. Remember: 1. Reuse, 2. Repair, 3. Recycle, in this exact order.

Furthermore, since we are already having this ridiculous discussion: as previously pointed out, an older OS like Windows Vista SP2 supports .NET Standard 1.3 at most, and it, like Windows versions up to (non-updated) Windows 7 SP1, is many times more eco-friendly and ethical, because, by Microsoft's own admission, it lacks most of the keyloggers and other spyware/malware built into later versions of Windows itself, all of which make sure every single one of those CPU cycles is not only wasted, but used unethically and to our individual and collective detriment.

So by targeting .NET Standard 1.3, hey, guess what, you at least can use your Class Library project in one more OS that isn't as blatantly unethical and polluting as the only OSes modern .NET can target. For Windows XP support, even .NET Standard won't do, but PCL will, and so based on your line of argumentation, we should reinstate full PCL support in the next Visual Studio release. Because "climate change".

@terrajobst
Member Author

terrajobst commented Jun 1, 2024

Thanks everyone for your feedback!

I've added a few more explanations in 83157fd, which address two major points:

  1. What's the downside of targeting .NET Standard 1.x?
  2. Should you drop .NET Standard altogether?

I understand that some of you might not like the warning because you have requirements that make you keep targeting .NET Standard 1.x. That's why the warning is suppressible, and just like calling OS-specific APIs, intentional suppression is very much part of the design. In the end, our goal is to build tooling around the guidance that makes sense for the vast majority of our customers. Given the complexities around most features, no guidance is going to be true in every case; I believe not providing tooling would be a disservice to our community, because it would basically mean a minority of advanced users holds the simpler experience hostage, which is generally not aligned with the design philosophy of .NET.

Hence, I'm going to merge this now.

@terrajobst terrajobst merged commit 7e18508 into dotnet:main Jun 1, 2024
2 checks passed
@terrajobst terrajobst deleted the net-standard-recommendation branch June 1, 2024 21:10