Probably wrong complexity of std::includes #2082

Closed
pisarik opened this issue May 24, 2018 · 5 comments

pisarik commented May 24, 2018

I started from a question on Stack Overflow.

The standard states that the algorithm std::includes performs at most 2 * ((last1 - first1) + (last2 - first2)) - 1 comparisons.

But it seems the algorithm needs at most 2 * (last1 - first1) comparisons. This mistake probably arose from the complexity requirements of std::set_union and std::set_intersection.
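
For reference, here is a minimal sketch of the usual merge-style implementation (the name includes_sketch is mine; this is only an illustration, not the standard's wording). It performs at most 2 * (last1 - first1) element comparisons, because every iteration that compares elements also advances first1:

template <class InputIt1, class InputIt2>
bool includes_sketch(InputIt1 first1, InputIt1 last1, InputIt2 first2, InputIt2 last2) {
    // Both ranges are assumed sorted. Each loop iteration either returns or
    // advances first1, and does at most two element comparisons, so the total
    // is at most 2 * (last1 - first1) comparisons.
    for (; first2 != last2; ++first1) {
        if (first1 == last1 || *first2 < *first1)
            return false;          // *first2 cannot occur in the rest of range 1
        if (!(*first1 < *first2))
            ++first2;              // *first1 matches *first2; consume the needle
    }
    return true;
}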

It would also be nice to see a good answer to the Stack Overflow question.

timsong-cpp (Contributor) commented

This is not an editorial issue; it needs to go to LWG.

zygoloid added the "lwg" (Issue must be reviewed by LWG.) label on May 24, 2018

pisarik (Author) commented May 24, 2018

Should I create an issue at issues.isocpp.org?

zygoloid (Member) commented

I believe this issue is incorrect. Remember that either range can contain duplicate elements. Consider:

#include <algorithm>
#include <iterator>

int haystack[] = {1};
int needles[] = {1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1};
bool result = std::includes(std::begin(haystack), std::end(haystack), std::begin(needles), std::end(needles));

Clearly we cannot give a complexity bound better than last2 - first2 comparisons.

zygoloid removed the "lwg" (Issue must be reviewed by LWG.) label on May 24, 2018

timsong-cpp (Contributor) commented

See http://lists.isocpp.org/lib/2018/05/7092.php; includes is actually checking for subsequence-ness. Once you exhaust haystack there's no point going through the rest of needles.
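
As a rough illustration (the counting lambda is just for demonstration, and the exact count depends on the standard library implementation), one can watch how many comparisons std::includes actually makes on zygoloid's example:

#include <algorithm>
#include <cstdio>
#include <iterator>

int main() {
    int haystack[] = {1};
    int needles[] = {1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1};

    long calls = 0;
    // Count the element comparisons std::includes actually performs.
    bool r = std::includes(std::begin(haystack), std::end(haystack),
                           std::begin(needles), std::end(needles),
                           [&calls](int a, int b) { ++calls; return a < b; });

    // A merge-style implementation stops as soon as haystack is exhausted,
    // so the count stays far below 2 * ((last1 - first1) + (last2 - first2)) - 1.
    std::printf("includes returned %d after %ld comparisons\n", (int)r, calls);
}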

zygoloid (Member) commented

The complexity requirements are consistent with the current description, and should be fixed if/when the description is fixed to actually describe what the algorithm is "supposed" to do. It seems LWG is already on the case. I'll reply to that lib thread to request that the complexity be revisited when the spec is fixed.
