[Proposal]: Overload Resolution Priority (VS 17.12, .NET 9) #7706
Comments
So the idea is that if I consume a DLL with obsolete members, I can swap out a dynamically invoked DLL with a newer version, but I wouldn't be able to compile against the newer one? In other words, the idea is to allow types that are visible to the runtime but not the compiler? |
Precisely. |
Why not just add a flag to `ObsoleteAttribute`? |
@TahirAhmadov I'd imagine it's easier to query whether this attribute exists than to query whether a property on it exists and then question whether its value is set to `true`. |
The compiler already queries for `ObsoleteAttribute`'s other properties. More importantly though, logically a 3-state enum fits this better than another boolean flag. |
This would also have the added benefit of pushing developers to check over all their existing Obsolete attributes. |
@TahirAhmadov You might be right, I'm a bit torn on this but for whatever reason they chose to go with an attribute so maybe they can share their clarification in a comment or update the OP. :) |
Since the main motivation for this is to resolve overload resolution problems, I propose this be resolved by changing `ObsoleteAttribute`.

**Idea 1: Make Obsolescence Affect Overload Resolution**

We define an "obsolete context" as one where a containing type or member is obsolete. In such a context, overload resolution is treated the same as it is today. Outside of an obsolete context, obsolete members are treated as a last resort for resolution, perhaps even after extension members with the same signature.

**Idea 2: Add a Boolean Property to ObsoleteAttribute**

```cs
partial class ObsoleteAttribute
{
    bool BinaryCompatOnly { get; set; }
}
```

or an enum like the other commenter suggested. |
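For reference, `ObsoleteAttribute` already carries optional, compiler-read properties (`DiagnosticId` and `UrlFormat`, added in .NET 5), so a `BinaryCompatOnly` property would follow an established pattern. A minimal sketch using today's real API (the `BinaryCompatOnly` property itself does not exist; the `Widget` type is made up for illustration):

```cs
public class Widget
{
    // Today's ObsoleteAttribute already lets API authors attach extra, tool-readable data;
    // the compiler surfaces DiagnosticId and expands {0} in UrlFormat with it.
    [Obsolete("Use NewThing instead.", DiagnosticId = "LIB0001", UrlFormat = "https://example.com/obsoletions/{0}")]
    public void OldThing() { }

    public void NewThing() { }
}
```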
Would the compiler resolve the annotated member as a last resort, for the purposes of providing an error message? Or would the member be effectively completely invisible to the compilation process? I think it does make sense to evolve the ObsoleteAttribute rather than creating a new one. My one concern would be that existing compilers wouldn't know to look for these changes, but as long as setting those new properties or whatever also results in the usual obsolete diagnostics on older compilers, that should be fine. |
Sign me up. This would be really useful for me. Paint.NET's plugin system is almost 20 years old, and sometimes it's quite difficult to modernize things, or to get plugin authors to use the correct (new) systems, when all the old stuff is still littering the namespaces and IntelliSense. For example, I have two BGRA32 color/pixel structs, a legacy one and its modern replacement.

Here's another example. Let's say you're writing a library of primitives for things like vectors, matrices, colors, etc. Sometimes these should be explicitly castable to other primitives, other times implicitly castable. But sometimes you make a mistake and ship an implicit cast that should have been explicit:

```cs
public readonly struct Matrix5x4
{
    private readonly Vector4 row0, row1, row2, row3, row4;
    ...
    public static implicit operator Matrix4x4(Matrix5x4 matrix) => ...; // OOPS 😢
}
```

I mention this because I actually made this type of mistake recently 😢 I got lucky -- no plugins were using the cast operator, so I just changed it. It'd be nice to have this capability:

```cs
[BinaryCompatOnly]
public static implicit operator Matrix4x4(Matrix5x4 matrix) => (Matrix4x4)matrix;

public static explicit operator Matrix4x4(Matrix5x4 matrix) => ...
```

BTW, extension methods already have a pseudo version of this behavior: an applicable instance member hides them from overload resolution. |
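As context for the operator example above, C# today rejects declaring both an implicit and an explicit user-defined conversion with the same source and target types, which is why a `[BinaryCompatOnly]`-style marker would need compiler support. A minimal sketch of the current restriction, with hypothetical `A` and `B` structs:

```cs
public readonly struct A
{
    public readonly int Value;
    public A(int value) => Value = value;
}

public readonly struct B
{
    public readonly int Value;
    public B(int value) => Value = value;

    public static implicit operator A(B b) => new A(b.Value);

    // Uncommenting this is a compile error today ("duplicate user-defined conversion"),
    // because both conversions have the same signature (B -> A).
    // public static explicit operator A(B b) => new A(b.Value);
}
```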
My concern with this is that the BCL may want to do this in libraries that are still targeting netstandard2.0. It's certainly something we can discuss though. |
@333fred isn't it the same issue with either adding a new attribute or modifying an existing one? Please elaborate... |
No, it is not the same. You could polyfill BinaryCompatAttribute. |
Hmmm... If we were to change how `ObsoleteAttribute` works, how about modifying only the compiler's handling of it, without touching the netstandard2.0 surface? Or just do both - modify `ObsoleteAttribute` where possible and add a new attribute for down-level targets. These options may sound too complicated, but it just feels so right to modify `ObsoleteAttribute`. |
Absolutely not, but I think you may have known that 😛.
That doesn't seem preferable to me. That seems way more complex. |
@333fred Couldn't such libraries still define an internal System.ObsoleteAttribute and silence the warning that a compilation type is being preferred over a conflicting library type? |
@jnm2 I think it's possible to create a NuGet package which just adds source to the project - not a referenced DLL, right? Then this "internal" `ObsoleteAttribute` would be compiled directly into each consuming project. |
Moving from #7707 (comment), one idea to support type-forwarding these hidden types is to add a command-line option for a line-delimited file with the assembly-qualified names of the types to forward. The compiler will check that these types exist in the referenced assembly, and emit entries in the output assembly's exported-types (type forwarding) metadata table. |
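For comparison, the source-level mechanism such an option would mirror already exists: `TypeForwardedToAttribute`. A minimal sketch (reusing `MyNamespace.MyObsoleteType` from the MSBuild example further down); the catch, and the reason a file-based option is proposed, is that `typeof` requires the forwarded type to be visible to the compiler:

```cs
// In the forwarding assembly: tells the runtime to resolve MyNamespace.MyObsoleteType
// from the assembly that now defines it, so old binaries referencing this assembly keep
// working. This needs a compile-time reference to the type, which is exactly what a
// compiler-hidden type would not have.
using System.Runtime.CompilerServices;

[assembly: TypeForwardedTo(typeof(MyNamespace.MyObsoleteType))]
```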
I could be misremembering, but no, I don't believe they could, just like (iirc) you can't do that with the other compiler-recognized attributes. A quick test:

```cs
using System;

namespace ObsoleteTest
{
    internal class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello, World!");
        }
    }

    [Obsolete(ObsoleteLevel.BinaryCompatOnly)]
    class Foo
    {
    }
}

namespace System
{
    enum ObsoleteLevel { Warning, Error, BinaryCompatOnly };

    class ObsoleteAttribute : Attribute
    {
        public ObsoleteAttribute(ObsoleteLevel level = ObsoleteLevel.Warning) { }
    }
}
```

This currently causes a failure (error output elided), and the IDE shows an exception in the stack trace (screenshot elided). Looks like this is not currently supported, but surely this can be made to work? The legacy netstandard2.0 `ObsoleteAttribute` is unchanged, but the compiler can be changed to accommodate this, right? |
Please no ... I don't think you can even add custom command-line args for the C# compiler in Visual Studio |
@rickbrew it can be integrated with the project files like this:

```xml
<PropertyGroup>
  <TypeForwards>forwards.txt</TypeForwards>
</PropertyGroup>
```

Or this:

```xml
<ItemGroup>
  <TypeForward Include="MyNamespace.MyObsoleteType[MyObsoleteAssembly]" />
  <TypeForward Include="MyNamespace.MyObsoleteType2[MyObsoleteAssembly]" />
</ItemGroup>
```
|
Okay that would work :) |
Maybe it's a stupid idea - but can we re-use |
I think we should make it so that a class in our assembly hides all external classes with the same full name. (Those classes could still be accessed with extern alias, if necessary.) |
@ds5678 I am unsure what you mean by that or how it applies here. |
We never take that approach for attributes we expect users to apply directly, and I wouldn't expect that to change with this proposal. |
I wouldn't conflate this feature with the Obsolete attribute, they seem orthogonal. |
@AraHaan Like with other statically-analyzed attributes, libraries can declare their own copies of the attributes as internal, with the same fully qualified name, and the compiler will recognize them. |
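A minimal polyfill sketch along those lines, assuming the attribute shape that shipped in .NET 9 (`System.Runtime.CompilerServices.OverloadResolutionPriorityAttribute` with an `int` priority) and a multi-targeting project; note the caveat two comments down that polyfilling these attributes on lower TFMs has footguns and is not an officially supported scenario:

```cs
// Internal copy for TFMs that don't ship the attribute; the compiler matches
// statically-analyzed attributes by their full name, not by assembly.
#if !NET9_0_OR_GREATER
namespace System.Runtime.CompilerServices
{
    [AttributeUsage(AttributeTargets.Method | AttributeTargets.Constructor | AttributeTargets.Property,
                    AllowMultiple = false, Inherited = false)]
    internal sealed class OverloadResolutionPriorityAttribute : Attribute
    {
        public OverloadResolutionPriorityAttribute(int priority) => Priority = priority;

        // Higher values are preferred by overload resolution; unannotated members default to 0.
        public int Priority { get; }
    }
}
#endif
```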
Yeah, but I would rather it be compiler-generated for all TFMs. Mainly because there are tons of attributes that are already compiler-generated for all TFMs nowadays, so what's the problem with one more? |
As I previously stated, the compiler does not generate attributes we expect users to apply by hand, and this policy will not change here. There are lots of footguns for polyfilling these attributes on lower TFMs, and we do not support the scenario. |
Is it yet decided whether this will allow overloading by return type? If so, we need new reflection APIs (e.g. a `Type.GetMethod` overload that also takes the desired return type). |
Technically the CLR has always allowed for overloading by return type, as the return type is a part of the method signature. I'm not aware of any languages that take advantage of the capability, though. |
@HaloFour Yes, I know. I'm asking if this proposal will allow overloading by return type in C# as well, with one of the overloads having a higher priority and thus always being the one that is picked. Then the need for getting a method not only by parameter types but also by return type in reflection will be greater. And it would be good to get this ability in the same .NET release. (Although I think having the ability to specify the return type in `Type.GetMethod` would be useful regardless of this proposal.) |
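For what it's worth, today's reflection API has no `GetMethod` overload that takes a return type, so disambiguating return-type-only overloads (which the CLR permits) means filtering candidates manually. A small sketch with a hypothetical helper name:

```cs
using System;
using System.Linq;
using System.Reflection;

static class ReflectionHelpers
{
    // Hypothetical helper: finds a method by name, parameter types, and return type,
    // since Type.GetMethod cannot distinguish overloads that differ only by return type.
    public static MethodInfo? GetMethodByReturnType(Type type, string name, Type[] parameterTypes, Type returnType)
        => type.GetMethods(BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.Static)
               .Where(m => m.Name == name && m.ReturnType == returnType)
               .FirstOrDefault(m => m.GetParameters().Select(p => p.ParameterType).SequenceEqual(parameterTypes));
}
```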
The proposal only affects overload resolution. Meaning which overload is picked for a particular call. It doesn't change at all the rules about what overloads can be declared. |
@CyrusNajmabadi And is there no plan to allow such overloads to be declared? From the way this feature (or the previous one - BinaryCompatOnly) was advertised and talked about by Jared and others, they mentioned the possibility of overloading by return type thanks to this feature, if I recall correctly. |
We'd have to update this proposal then in that case, or add a new proposal for that :) I'm not opposed to it. It just doesn't seem to be part of what is being designed right here, right now. Perhaps for the future. |
(I tried to find previous discussion of this question, but I didn't see any. If this has already been talked about somewhere, please point me to it.) Given this bit from the proposal summary about the feature's intended use —
(emphasis mine) — and given this example from § Motivation —
(emphasis mine) — what happens when (1) an older C# compiler, (2) a C# compiler targeting an older language version, or (3) a language that does not understand this attribute encounters a library that depends on this feature? Will it be impossible to consume an overloaded API whose only disambiguator is this attribute in such a case? Or am I misunderstanding the implications here? |
At the risk of stating the obvious, such an attribute would presumably never be the only disambiguator between overloads. It only helps with the decision making when there are ambiguous non-exact matches (implicit casts). Adding explicit casts so that your invocation exactly matches a (non-binary-compat-only) overload should succeed, in C# as well as other languages. In other words, if you can change the invoking code, you can always make it work. Without changing the code, when a library introduces back-compat-only overloads I'd expect it to risk breaking on an older compiler and for a language that does not understand the attribute. |
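To make the "add explicit casts" escape hatch concrete, here is a minimal sketch (hypothetical `IA`/`IB`/`C` types): when two overloads are ambiguous for an argument, casting the argument so it exactly matches one signature resolves the call without any attribute support.

```cs
public interface IA { }
public interface IB { }
public class C : IA, IB { }

public static class Demo
{
    public static void M(IA a) { }
    public static void M(IB b) { }

    public static void Caller()
    {
        var c = new C();
        // M(c);    // error: the call is ambiguous between M(IA) and M(IB)
        M((IA)c);   // exact match after the cast, so M(IA) is chosen
    }
}
```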
If there would be a way to prioritize an instance method lower than any unannotated extension method, this could be the way out of the dilemma in dotnet/runtime#18087 (comment). |
Hi all, can somebody please explain why this is the case? I mean, "programs which previously COMPILED would now NOT COMPILE" is obviously bad, no questions asked. But every piece of code that tries to use features introduced only later (e.g. trying to use a newer overload with an older compiler) already fails to compile today, so why would the opposite direction, where previously non-compiling code starts to compile, be a problem? Thanks! |
You're right, this direction should not be problematic. However, changing overload resolution can still silently change what existing code binds to, which is also a breaking change. |
Thank god, finally, I was going a lil bit insane. I've been told multiple times that it was a problem, and I was not able to understand why 😅
Since we are here, can you give me one example of this?
I'm asking because I fail to see (again, my bad) how that could be, not in general but in this very specific case. The change we are talking about is such that code that compiled before would keep compiling; only the reverse could happen, i.e. code that didn't compile starts compiling (and we agreed that is not a problem). How can "excluding from overload resolution, at a specific call site, an overload that cannot be called from that call site anyway" break an existing piece of code? Just to be clear, I'm not saying I'm necessarily right, I'm asking if it's possible to see an instance of such a case to better understand it. Thanks for your time. |
@jodydonetti I'll point out that you are quoting from comments on the previous version of this proposal, not the current version. However, yes, it is a problem when code that previously didn't compile now compiles; what that often means in practice is that code which previously compiled because instance member overload resolution failed, and an extension was chosen instead, now chooses an instance member, which is a silent breaking change. The version you're talking about here also had a lot of problems around inheritance and testability, which is why we moved to `OverloadResolutionPriority`. |
Sorry about that, it was not intentional.
Let's use an MRE (.NET 9):

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    [Obsolete("Use Bar(int) instead.", true)]
    public void Bar(bool b = false)
    {
    }

    public void Bar(int i = 0)
    {
    }
}

public static class FooExtensions
{
    public static void Bar(this Foo foo)
    {
    }
}
```

This does not compile, and even though overload resolution fails on the instance members and they cannot be called, the extension is not chosen instead. Also, one thing I'm not getting is this part:
We were talking about code that did not compile, not code that did. Am I missing something?
If you have some time to spare, can you point me to an example of that? I'd like to understand more and haven't found one by looking at the other comments.

**A practical example**

To shorten the back-and-forth and save time on your side (sorry), here is the MRE I'm not able to wrap my head around.

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    public void Bar(int i = 0)
    {
    }
}
```

This compiles: the compiler picks the only one, easy peasy.

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    private void Bar(bool b = false)
    {
    }

    public void Bar(int i = 0)
    {
    }
}
```

This compiles (note the `private`): the compiler ignores the overload that is not callable from that call site and picks the public one.

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    public void Bar(bool b = false)
    {
    }

    public void Bar(int i = 0)
    {
    }
}
```

This does NOT compile: the compiler cannot pick one callable from that call site, since the 2 are both callable and compatible.

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    private void Bar(bool b = false)
    {
    }

    //public void Bar(int i = 0)
    //{
    //}
}
```

This does NOT compile (note the `private` and the commented-out public overload): the compiler sees only one, but that is NOT callable from that call site. So it knows that that overload is not callable from the specific call site.

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    [Obsolete("Use Bar(int) instead.", true)]
    public void Bar(bool b = false)
    {
    }

    //public void Bar(int i = 0)
    //{
    //}
}
```

This does NOT compile: the compiler sees only one, but that is NOT callable from that call site because of obsolete+error. So it knows that that overload is not callable from the specific call site, much like a private overload.

And now the curveball.

```cs
var foo = new Foo();
foo.Bar();

public class Foo
{
    [Obsolete("Use Bar(int) instead.", true)]
    public void Bar(bool b = false)
    {
    }

    public void Bar(int i = 0)
    {
    }
}
```

This does NOT compile: the compiler sees 2, and knows that one is NOT callable from that call site because of obsolete+error, but it does not ignore it. The compiler knows that from the specific call site it cannot call a private one, so it ignores it, but it doesn't do the same for the obsolete+error one, even though both are not callable from that call site, so much so that when only the obsolete+error one is present, it does not compile.

How can it not have a doubt about overload resolution when the overload is private, but have it when it's obsolete+error? This is the part where I can't see how it can make sense.

Sorry for the long post, I wanted to be as clear and precise as possible. |
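For completeness, here is a sketch of how the attribute in the current proposal addresses that last MRE, assuming the shape that shipped in C# 13 / .NET 9: deprioritizing the obsolete overload makes `foo.Bar()` bind to `Bar(int)` instead of reporting an ambiguity, while already-compiled callers of `Bar(bool)` keep working.

```cs
using System;
using System.Runtime.CompilerServices;

var foo = new Foo();
foo.Bar(); // binds to Bar(int): the obsolete overload sits in a lower priority group

public class Foo
{
    [Obsolete("Use Bar(int) instead.", true)]
    [OverloadResolutionPriority(-1)] // considered only if no higher-priority overload is applicable
    public void Bar(bool b = false)
    {
    }

    public void Bar(int i = 0)
    {
    }
}
```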
Code that doesn't compile can have downstream impacts that can cause something else to compile in a different manner. These are often particularly esoteric scenarios, true, that usually have to go through multiple levels of overload resolution and lambda binding, but it can happen. In the simple case where we must be talking about an ambiguity, there is indeed nothing an extension would change, but these types of subtle overload resolution changes can pop up in unexpected places and it may not always surface as an ambiguity. I don't feel confident enough to state for sure that marking something `BinaryCompatOnly` could never change the meaning of existing code. The previous version of the proposal, and a brief sentence as to why it was abandoned, is at the end of the current proposal: https://github.com/dotnet/csharplang/blob/main/proposals/overload-resolution-priority.md#alternatives.
Because private members don't participate in overload resolution at all. The way the compiler implements overload resolution here is actually 2 passes: in the first pass, it does overload resolution respecting visibility rules. When this pass fails and can't find anything to look at, we then do a second pass, not respecting visibility rules; this pass is not required by the specification at all. We just want to give a more helpful error in these cases. On the other hand, members that are marked `[Obsolete]` with error do participate in overload resolution normally; the error is only reported after an overload has been chosen. |
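A tiny illustration of the two behaviors described above, using today's compiler and hypothetical `Widget`/`Caller` types: the private member is simply invisible to overload resolution (its accessibility error comes from the diagnostic-only second pass), while the obsolete-as-error member is selected first and only then reported.

```cs
using System;

public class Widget
{
    private void Hidden(int x) { }          // never a candidate from outside this type

    [Obsolete("Use something else.", true)]
    public void Gone(int x) { }             // still a candidate; use sites get an error after it is chosen
}

public class Caller
{
    public void Use(Widget w)
    {
        // w.Hidden(1); // error: 'Widget.Hidden(int)' is inaccessible due to its protection level
        // w.Gone(1);   // error: 'Widget.Gone(int)' is obsolete: 'Use something else.'
    }
}
```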
Summary
We introduce a new attribute, `System.Runtime.CompilerServices.OverloadResolutionPriority`, that can be used by API authors to adjust the relative priority of overloads within a single type as a means of steering API consumers to use specific APIs, even if those APIs would normally be considered ambiguous or otherwise not be chosen by C#'s overload resolution rules.
Motivation
API authors often run into an issue of what to do with a member after it has been obsoleted. For backwards compatibility purposes, many will keep the existing member around with `ObsoleteAttribute` set to error in perpetuity, in order to avoid breaking consumers who upgrade binaries at runtime. This particularly hits plugin systems, where the author of a plugin does not control the environment in which the plugin runs. The creator of the environment may want to keep an older method present, but block access to it for any newly developed code. However, `ObsoleteAttribute` by itself is not enough. The type or member is still visible in overload resolution, and may cause unwanted overload resolution failures when there is a perfectly good alternative, but that alternative is either ambiguous with the obsoleted member, or the presence of the obsoleted member causes overload resolution to end early without ever considering the good member. For this purpose, we want to have a way for API authors to guide overload resolution on resolving the ambiguity, so that they can evolve their API surface areas and steer users towards performant APIs without having to compromise the user experience.
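As a concrete sketch of the mechanism (using the shape that shipped in C# 13 / .NET 9; the `Logger` type and its overloads are made up for illustration), the applicable overload with the highest declared priority wins, so new compilations silently move to the preferred API while the old one remains for binary compatibility:

```cs
using System;
using System.Runtime.CompilerServices;

public static class Logger
{
    // Preferred for all new compilations: the higher priority beats the string overload
    // even though the string overload would otherwise be the exact match.
    [OverloadResolutionPriority(1)]
    public static void Log(ReadOnlySpan<char> message) { /* allocation-free path */ }

    // Kept so already-compiled callers that bound to Log(string) keep working.
    public static void Log(string message) { }
}

public static class Usage
{
    public static void Main() => Logger.Log("hello"); // binds to Log(ReadOnlySpan<char>)
}
```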
Design Meetings
Related Issues