[FEA]: Thrust-NVRTC Support #4305
Comments
Thanks @lamarrr. To scope this down a bit, it would be helpful if you could first do a pass over libcudf to see which Thrust headers and symbols can just be replaced. For example, without actually looking, I believe all of the following have direct replacements in
Naively, I believe all that would really remain would be making Thrust's fancy iterators (or replacements in
AFAIK, all Thrust iterators already work under NVRTC since at least CCCL 3.0, see #3676. Furthermore, if you only need Thrust's sequential algorithm implementations, you will be able to get by with just Once #3741 lands, let's narrow down your list to what's still missing.
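To illustrate the sequential-algorithms point above, here is a minimal sketch of the kind of device-only code that could be handed to NVRTC/JITIFY. It assumes CCCL >= 3.0 (where Thrust's iterators and sequential algorithms work under NVRTC); the kernel name and shape are illustrative, not a libcudf API. It is not testable without the CUDA toolkit.

```cpp
// Hypothetical device-only translation unit for NVRTC/JITIFY.
// Assumes CCCL >= 3.0; only device-safe Thrust headers are included.
#include <thrust/execution_policy.h>
#include <thrust/binary_search.h>

extern "C" __global__ void lookup_kernel(int const* haystack,
                                         int haystack_size,
                                         int const* needles,
                                         int* out,
                                         int num_needles)
{
  int i = blockIdx.x * blockDim.x + threadIdx.x;
  if (i >= num_needles) return;
  // thrust::seq runs the algorithm serially inside this thread,
  // with no host-side machinery or host headers required.
  auto it = thrust::lower_bound(thrust::seq,
                                haystack, haystack + haystack_size,
                                needles[i]);
  out[i] = static_cast<int>(it - haystack);
}
```

The point of `thrust::seq` here is that the execution policy selects a purely per-thread sequential implementation, which is the part of Thrust most likely to be usable from device-only JIT code.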
Agreed. Most of the issues should be solved if we use the
Is this a duplicate?
Area
Thrust
Is your feature request related to a problem? Please describe.
CUDF is adopting JIT for kernel compilation via JITIFY/NVRTC.
NVRTC, unlike NVCC, requires that source files contain only device code: no host code or host headers, even if they are unused. Support for host code is not planned either. This has prevented us from adopting JITIFY for our device kernels, as our dependencies don't support this use case either.
We need Thrust to support inclusion from offline-compiled device-only JITIFY code.
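For context, a minimal host-side sketch of how such device-only source is JIT-compiled with NVRTC (the embedded kernel and file name are illustrative; the program must link against `-lnvrtc`):

```cpp
// Hypothetical sketch of JIT-compiling a device-only source with NVRTC.
// Requires the CUDA toolkit and linking with -lnvrtc.
#include <nvrtc.h>
#include <cstdio>
#include <cstdlib>
#include <string>

static char const* kSource = R"(
  // NVRTC accepts only device code: any transitively included host
  // header (e.g. <pthread.h>) causes compilation to fail, which is
  // why every header pulled in here must be device-safe.
  extern "C" __global__ void scale(float* x, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= s;
  }
)";

int main() {
  nvrtcProgram prog;
  nvrtcCreateProgram(&prog, kSource, "scale.cu", 0, nullptr, nullptr);
  char const* opts[] = {"--std=c++17"};
  if (nvrtcCompileProgram(prog, 1, opts) != NVRTC_SUCCESS) {
    // On failure, the log points at the offending (often host) construct.
    size_t log_size = 0;
    nvrtcGetProgramLogSize(prog, &log_size);
    std::string log(log_size, '\0');
    nvrtcGetProgramLog(prog, log.data());
    std::fprintf(stderr, "%s\n", log.c_str());
    return EXIT_FAILURE;
  }
  size_t ptx_size = 0;
  nvrtcGetPTXSize(prog, &ptx_size);  // PTX would then be loaded via the driver API
  nvrtcDestroyProgram(&prog);
  return EXIT_SUCCESS;
}
```

JITIFY wraps this workflow and adds header lookup and caching, but the underlying constraint is the same: everything reachable from the JIT-compiled source must be device-only.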
Describe the solution you'd like
Thrust needs to support device-side sequential execution (thrust::seq) when in JITIFY mode.
Thrust must not include host-only headers (e.g. pthread.h). Ideally, it should only include cuda/std/* headers.
Describe alternatives you've considered
Patching Thrust ourselves to support JITIFY code: not feasible or practical.
Additional context
Here are some of the Thrust headers we use in CUDF's device code: