On Thu, Nov 27, 2014 at 9:15 PM, Gavin Lambert
On 28/11/2014 14:08, Vicente J. Botet Escriba wrote:
Le 27/11/14 23:44, Gavin Lambert a écrit :
I'm not sure I understand the meaning of having an order that isn't implied by op<. If op< is omitted because comparison is meaningless or otherwise confusing, how could it be well-defined how to sort them? If there is not a single well-defined sort order, then the library should not just pick one arbitrarily; and if there is one, then why would it not be appropriate as an op<?
There are a lot of kinds of orders (http://en.wikipedia.org/wiki/Total_order). Some of us would like T < T to mean the same kind of order independently of T. As we already have int < int representing the natural order, it follows that T < T should then mean the natural order. We can have many other orders that don't use this specific syntax. The order needed by the ordered containers is not the natural order but a strict weak ordering ( http://en.wikipedia.org/wiki/Weak_ordering#Strict_weak_orderings ).
That doesn't address what I was referring to. My point was that if there is disagreement on where "none" should sort, or how "complex" should sort, then the library should not define these as default sorts (via op< or std::less). Instead it should either provide standard sorting predicates for the common options or just leave it up to the application to decide how it wants to sort things.
Why? I've heard this stated before regarding ordering and frankly I think
it's nonsense. Whenever you design a library you're going to make choices
that don't have an objective correct or incorrect option, even separate
from the issue of ordering. This can be anywhere from simple things like
naming or parameter order to the higher-level semantics of your types. The
library developer frequently makes subjective decisions and that's
perfectly fine -- I'd even say it's a good thing. The fact that there are
multiple valid solutions does not at all imply that you should avoid
choosing any of them, especially in this case where we are talking about a
default that doesn't even have to be used. If, in the case of a default
ordering, that default isn't what's desired for a specific situation, then
the user can just be explicit there. Do you have a problem with the fact
that tuples have a default ordering? What about strings? These are not
problematic in practice, nor, I'd argue, are they problematic in theory.
Just to make things a little more grounded with respect to optional and
variant, consider what happens when you provide a default ordering:
First, there are many cases where someone just wants a default ordering,
regardless of what that ordering may be (e.g. to use the type in a set), as
long as the default isn't doing something like creating unnecessary
equivalence classes. In this case, it doesn't matter that the library
designer chose the default. Any of these orderings are acceptable for the
user. Alright, so what if, in a particular case, the default ordering isn't
what a user personally wants? In that case, the programmer would just use
an explicit ordering when using, e.g., a set. Note that this case is no
different from what the user would have to do if no default ordering were
provided. Finally, what if someone sees the default ordering used in code?
Because it might not be immediately clear to the user, they'd simply look
up what the library specifies, assuming they need to or want to know. I
don't see how any of these situations are problematic.
Okay, so what happens if you reverse the situation and don't provide a
default ordering:
First, people can no longer use the type in data structures like sets unless
they provide an explicit ordering, even if they don't care what the
ordering is. So here the user needs to do more to accomplish the same end
result. What exactly does the user gain? Is it just that if someone else
sees a default ordering used, they might have to investigate what that
default ordering is assuming they need to rely on it? If so, how is this at
all different from the user seeing any function used that they don't
immediately know the semantics of? In either case, all they'd do is look it
up. Just because the function happens to be called
std::order/std::less/operator< shouldn't make a difference.
As well, when you specify a default ordering, users can often take
advantage of it in a way that makes it applicable to their situation and
they retain the benefits of there being a default. As an example of this, on
one occasion I used a variant to store the rank of a 5-card poker hand. In
other words I had the following:
variant
optional<T>) and an implicit constructor both exist, then op<(optional<T>,T) will automatically exist even if not explicitly defined. Since the discussion related to explicit poisoning that didn't seem worth mentioning.
That's not true. Assuming that you are defining the operator as a template, and not, for instance, as an inline friend of the class template, then the implicit conversion there will not take place unless the template argument is explicitly provided.
Defining op< for containers was a necessary evil while std::map was the only standard associative container, and it happened to require ordered keys. Now that unsorted associative containers exist (std::unordered_map and its Boost equivalent for pre-C++11), I don't think its existence can be justified any more.
WHAT!? The choice of whether to use an ordered or unordered associative container should have nothing to do with this. Period. They are entirely different data structures with different complexities, and the issue of defaults is entirely orthogonal. However, while we're on the subject, I think it's really important to point out that the unordered containers also take advantage of a default that, by your and others' logic, shouldn't exist -- the default hash. Just like with ordering, there are any number of equally valid hashes that can be associated with a given type. Why is one better than any other? I don't see why anyone would say that it doesn't make sense to have a default ordering and yet it does make sense to have, and rely on, a default hash.
--
-Matt Calabrese