
Error Message In Python With Differentiation

I am computing these derivatives using the Monte Carlo approach for a generic call option. I am interested in the combined derivative (with respect to both S and sigma). Doing this
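For concreteness, the estimator in the question was presumably something along these lines (a minimal sketch; the payoff, parameter values, and the use of a NumPy RandomState are assumptions, not the asker's actual code). Note that it returns one discounted payoff per simulated path, i.e. a vector:

```python
import numpy as np
import jax.numpy as jnp

def second_derivative_mc(S, sigma, T=1.0, r=0.0, K=1.0, n_paths=100):
    # Hypothetical reconstruction: Monte Carlo payoffs of a European call
    # under geometric Brownian motion. Stateful NumPy RNG used here only
    # to mirror the question's setup.
    rng = np.random.RandomState(0)
    Z = rng.randn(n_paths)  # standard normal draws, one per path
    ST = S * jnp.exp((r - 0.5 * sigma**2) * T + sigma * jnp.sqrt(T) * Z)
    # Discounted payoff per path -- a length-n_paths vector, not a scalar
    return jnp.exp(-r * T) * jnp.maximum(ST - K, 0.0)
```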

Solution 1:

As the error message indicates, gradients can only be computed for functions that return a scalar. Your function returns a vector:

print(len(second_derivative_mc(1.1, 0.5)))
# 100

For vector-valued functions, you can compute the Jacobian (which generalizes the gradient to vector-valued outputs). Is this what you had in mind?

from jax import jacobian, vmap

greek = vmap(jacobian(jacobian(second_derivative_mc, argnums=1), argnums=0))(
    Underlying_asset, volatilities
)
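To see how the nested jacobian call behaves, here is a toy stand-in for second_derivative_mc: a function of two scalars returning a length-3 vector (the function itself is made up for illustration):

```python
import jax.numpy as jnp
from jax import jacobian, vmap

def f(s, v):
    # Toy vector-valued function of two scalar inputs
    return jnp.array([s * v, s**2 * v, s * v**2])

# Mixed second derivative d^2 f / (ds dv): differentiate once per argument
d2f = jacobian(jacobian(f, argnums=1), argnums=0)
print(d2f(1.1, 0.5))  # -> [1.  2.2 1. ]

# vmap applies d2f elementwise across paired arrays of inputs
S = jnp.array([1.0, 1.1, 1.2])
V = jnp.array([0.4, 0.5, 0.6])
batched = vmap(d2f)(S, V)  # shape (3, 3): one length-3 result per (S, V) pair
```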

Also, this is not what you asked about, but the function above will probably not work as you intend even if you solve the issue in the question. NumPy RandomState objects are stateful and thus will generally not work correctly with JAX transforms like grad, jit, vmap, etc., which require side-effect-free code (see Stateful Computations In JAX). You might try using jax.random instead; see JAX: Random Numbers for more information.
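A sketch of what that change might look like, using JAX's functional RNG (payoff and parameters are again assumptions): the PRNG key is passed in explicitly, so the function is pure and safe to transform with grad, jit, or vmap. Here the per-path payoffs are also averaged to a scalar, so plain grad applies directly:

```python
import jax.numpy as jnp
from jax import random, grad

def price_mc(S, sigma, key, T=1.0, r=0.0, K=1.0, n_paths=100):
    # Explicit key instead of a stateful RandomState: same draws for the
    # same key, no hidden side effects
    Z = random.normal(key, (n_paths,))
    ST = S * jnp.exp((r - 0.5 * sigma**2) * T + sigma * jnp.sqrt(T) * Z)
    # Mean over paths -> scalar output, so grad works without jacobian
    return jnp.mean(jnp.exp(-r * T) * jnp.maximum(ST - K, 0.0))

key = random.PRNGKey(0)
delta = grad(price_mc, argnums=0)(1.1, 0.5, key)  # dPrice/dS on fixed draws
```

Reusing the same key across the price and its derivatives keeps the random draws identical, which is usually what you want for pathwise Monte Carlo Greeks.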
