Why Does Pointer Decay Affect Overload Resolution in C++ Function Templates?
Pointer Decay and Function Overload Resolution
In C++, overload resolution selects the best-matching function for a given set of arguments. When several functions are viable candidates, the one whose implicit conversion sequences rank best, that is, the one needing the cheapest conversions for its arguments, is preferred.
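As a quick illustration of that ranking, consider the following invented overload set (the name bar and both overloads are hypothetical, used only to show the idea): an exact match beats a promotion, and a promotion beats a standard conversion.

#include <iostream>

void bar(int)    { std::cout << "int\n"; }
void bar(double) { std::cout << "double\n"; }

int main() {
    bar(42);    // exact match: bar(int) is chosen
    bar(3.14f); // float -> double is a promotion, float -> int is a conversion,
                // so the cheaper route wins and bar(double) is chosen
}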
Consider the following function template that prints the length of a character array:
template <size_t N> void foo(const char (&s)[N]) { std::cout << "array, size=" << N - 1 << std::endl; }
Calling foo("hello") deduces N = 6 from the string literal's type, const char[6], and prints "array, size=5". However, extending foo to support non-array arguments changes which function is selected in a surprising way.
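For reference, here is a minimal compilable sketch of this first, template-only version (the includes and main are added purely for illustration):

#include <cstddef>
#include <iostream>

template <std::size_t N>
void foo(const char (&s)[N]) {
    // "hello" has type const char[6] (five characters plus the terminating '\0'),
    // so N is deduced as 6 and N - 1 is the visible length.
    std::cout << "array, size=" << N - 1 << std::endl;
}

int main() {
    foo("hello"); // prints: array, size=5
}

The extension in question adds a plain, non-template overload taking a pointer: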
void foo(const char* s) { std::cout << "raw, size=" << strlen(s) << std::endl; }
Now, calling foo("hello") surprisingly prints "raw, size=5", even though the template specialization seems like a more precise match.
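Putting the two overloads side by side makes the behavior easy to reproduce; again, this is only a minimal sketch with includes and a main added for illustration:

#include <cstddef>
#include <cstring>
#include <iostream>

template <std::size_t N>
void foo(const char (&s)[N]) {
    std::cout << "array, size=" << N - 1 << std::endl;
}

void foo(const char* s) {
    std::cout << "raw, size=" << std::strlen(s) << std::endl;
}

int main() {
    foo("hello"); // prints: raw, size=5 (the non-template overload is selected)
}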
Why the Pointer Overload Is Chosen
The pointer overload wins because a string literal such as "hello" is an lvalue of type const char[6] that decays to a pointer to its first element. Under C++ overload resolution rules, this array-to-pointer decay is an Lvalue Transformation, and lvalue transformations do not lower the rank of an implicit conversion sequence: binding the array to the template's reference parameter and decaying it to const char* for the non-template overload both count as exact matches. Since the two conversion sequences are indistinguishable, the tie-breaking rules apply, and a non-template function is preferred over a function template specialization, so foo(const char*) is selected.
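The same tie-break applies to a named array, not only to a string literal. With both overloads above in scope, the following snippet (the variable name greeting is just for illustration) still picks the pointer overload:

const char greeting[] = "hello";   // type: const char[6]
foo(greeting); // prints "raw, size=5": the decay to const char* is an exact-match
               // lvalue transformation, and on a tie the non-template overload
               // beats the template specialization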
Working Around the Problem
To ensure that the array function overload is invoked, a workaround is to define the non-array overload as a function template as well:
template <typename T> auto foo(T s) -> std::enable_if_t<std::is_convertible<T, char const*>{}> { std::cout << "raw, size=" << std::strlen(s) << std::endl; }
With both candidates now being function templates, the non-template tie-breaker no longer applies; instead, partial ordering kicks in and prefers the more specialized array-reference template, so string literals once again select the array overload.
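A complete sketch of the workaround might look like this (the includes and main are added for illustration; std::enable_if_t requires C++14, and the enable_if condition keeps the generic template viable only for arguments convertible to char const*):

#include <cstddef>
#include <cstring>
#include <iostream>
#include <type_traits>

template <std::size_t N>
void foo(const char (&s)[N]) {
    std::cout << "array, size=" << N - 1 << std::endl;
}

template <typename T>
auto foo(T s) -> std::enable_if_t<std::is_convertible<T, char const*>{}> {
    std::cout << "raw, size=" << std::strlen(s) << std::endl;
}

int main() {
    foo("hello");             // array, size=5: partial ordering picks the more specialized template
    const char* p = "world";
    foo(p);                   // raw, size=5: only the pointer-style template is viable here
}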