
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: PyTorch error

I am trying to run some code in PyTorch, but I got stuck at this point:

On the first iteration, both backward operations, for the Discriminator and for the Generator, run fine:

    self.G_loss.backward(retain_graph=True)
    self.D_loss.backward()

On the second iteration, when self.G_loss.backward(retain_graph=True) executes, I get this error:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [8192, 512]] is at version 2; expected version 1 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!
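To make the message concrete, here is a minimal, self-contained sketch (not my actual code, just the generic pattern that I believe triggers the same complaint): an optimizer step modifies a weight in place between two backward passes through the same retained graph, so the saved tensor's version counter no longer matches what the graph expects.

    import torch
    import torch.nn as nn

    # Toy two-layer model -- NOT my Discriminator/Generator, only an illustration
    net = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 1))
    opt = torch.optim.SGD(net.parameters(), lr=0.1)

    x = torch.randn(16, 4)
    loss = net(x).sum()

    loss.backward(retain_graph=True)  # first backward: fine
    opt.step()                        # in-place parameter update bumps the weight's version counter
    loss.backward()                   # second backward through the stale graph fails:
                                      # "... is at version 2; expected version 1 instead"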

According to torch.autograd.set_detect_anomaly, the last of the following lines in the Discriminator network is responsible for this:

    bottleneck = bottleneck[:-1]                   # drop the last bottleneck entry
    self.embedding = x.view(x.size(0), -1)         # flatten to (batch, features), kept as an attribute
    self.logit = self.layers[-1](self.embedding)   # final layer in self.layers -> the line flagged by anomaly detection
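
For completeness, the backtrace above came from enabling anomaly detection before the training loop (standard usage, as far as I know):

    import torch

    # Ask autograd to record forward-pass stack traces so the failing backward op
    # can be traced back to the forward line that created it
    torch.autograd.set_detect_anomaly(True)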

The strange thing is that I have used that network architecture in other code where it worked properly. Any suggestions?

The full error:

    site-packages\torch\autograd\__init__.py", line 127, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [8192, 512]] is at version 2; expected version 1 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!
        
