
Does Tf.transpose Also Change The Memory (like Np.ascontiguousarray)?

If I use tf.transpose, does it also change the memory layout? In NumPy, the function np.ascontiguousarray is used for this. I mean this would be important if I use CUDA. Because it makes

Solution 1:

If you read the documentation carefully, you can find the answer:

Numpy Compatibility

In numpy transposes are memory-efficient constant time operations as they simply return a new view of the same data with adjusted strides.

TensorFlow does not support strides, so transpose returns a new tensor with the items permuted.

Hence tf.transpose returns a new tensor with the items physically permuted in memory (and is therefore less efficient than a NumPy view), so yes, it changes the memory layout.
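For contrast, the NumPy behaviour described in the quoted note can be checked directly: np.transpose returns a view with adjusted strides and no data movement, while np.ascontiguousarray materialises a fresh C-contiguous copy, which is roughly what tf.transpose does in a single step. A minimal sketch:

```python
import numpy as np

a = np.arange(6, dtype=np.float32).reshape(2, 3)

# Transposing only adjusts strides; no data is moved.
t = a.T
print(t.flags['C_CONTIGUOUS'])   # False: same buffer, new strides
print(np.shares_memory(a, t))    # True: it is a view of `a`

# ascontiguousarray copies the data into a C-contiguous layout.
c = np.ascontiguousarray(t)
print(c.flags['C_CONTIGUOUS'])   # True
print(np.shares_memory(a, c))    # False: a fresh buffer
```

This is exactly the view-versus-copy distinction the documentation note is drawing: TensorFlow always takes the copying path because it does not expose strides.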

However, if you only need to change the tensor's shape without permuting its elements, you can use tf.reshape instead of tf.transpose, which avoids creating a rearranged copy.
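Keep in mind that reshape and transpose are not interchangeable: reshape reinterprets the same flat element order under a new shape, while transpose actually reorders the elements. The NumPy analogue below (tf.reshape and tf.transpose follow the same semantics) makes the difference visible:

```python
import numpy as np

a = np.arange(6).reshape(2, 3)  # [[0, 1, 2], [3, 4, 5]]

# reshape keeps the flat order 0,1,2,3,4,5 and reads it as 3x2.
r = a.reshape(3, 2)             # [[0, 1], [2, 3], [4, 5]]

# transpose permutes elements: (i, j) -> (j, i).
t = a.transpose()               # [[0, 3], [1, 4], [2, 5]]

print(np.array_equal(r, t))     # False: different element layout
```

So tf.reshape is only a valid substitute when the new shape you want happens to preserve the flat ordering of the data.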
