@vectorize works for 'cpu' and 'parallel', but not for 'cuda'
I'm using the Numba module for Python, specifically its @vectorize decorator.
The signature is as follows:

@vectorize(['float64(int64,float64,float64,int64,int64,float64)'], target='cuda', nopython=True)
def calculate(a, b, c, d, e, f):
    # calculations here...
    return result
Here the output is float64 and the inputs are of various types.
As for usage, I pass single numbers for parameters a-e and a vector for the last parameter f. So it should basically parallelize the computation across the elements of that vector.
The problem is that the code works for the cpu and parallel targets, but not for cuda.
Is there something wrong with the signature? Or the way I pass parameters?
The error is:
File "path-to-my-file.py", line 7, in <module>
  @vectorize(['float64(int64,float64,float64,int64,int64,float64)'], target='cuda', nopython=True)
File "C:\Users\Chanto\AppData\Local\Continuum\anaconda3\lib\site-packages\numba\npyufunc\decorators.py", line 118, in wrap
  vec = Vectorize(func, **kws)
File "C:\Users\Chanto\AppData\Local\Continuum\anaconda3\lib\site-packages\numba\npyufunc\decorators.py", line 38, in __new__
  return imp(func, identity=identity, cache=cache, targetoptions=kws)
File "C:\Users\Chanto\AppData\Local\Continuum\anaconda3\lib\site-packages\numba\npyufunc\deviceufunc.py", line 354, in __init__
  assert not targetoptions
AssertionError
python parallel-processing gpu numba
Okay, I see you guys keep downvoting. Can you also make comments about why that is? Otherwise, this is not really helpful.
– Chanto, 2 days ago
asked 2 days ago by Chanto
edited 2 days ago by talonmies