torch.optim.LBFGS issue #126625
Let me explain.

```python
def net_output(self, x, y):
    # Implementation based on self.forward
    xy = torch.cat([x, y], dim=1)  # Concatenate the input data
    uv_pred = self.model(xy)       # Perform prediction using the modified MLP
    p = uv_pred[:, 1:2]
    '''
    omit other irrelevant code
    '''
```

The failing line is `view = p.grad.view(-1)`. If a tensor is

```python
tensor([[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]])
```

then its storage holds the elements row by row: `1, 2, 3, 4, 5, 6, 7, 8, 9`. Because `p = uv_pred[:, 1:2]` selects a single column,

```python
tensor([[2],
        [5],
        [8]])
```

the elements of `p` are not adjacent in that storage (they sit at offsets 1, 4, 7). So:

```python
>>> p.is_contiguous()
False
```

`.view()` never copies data, so it can only produce shapes that the tensor's existing strides can express; `.reshape()` falls back to making a copy when the layout requires one. I recommend this blog post about PyTorch internals: http://blog.ezyang.com/2019/05/pytorch-internals/
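The layout behaviour described above can be sketched end to end. This uses the same illustrative 3×3 tensor and column slice as the comment; the transposed tensor at the end is an extra example of a layout that `.view()` cannot express without copying:

```python
import torch

t = torch.tensor([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
# A contiguous tensor stores its elements row by row: 1, 2, ..., 9
print(t.is_contiguous())    # True

p = t[:, 1:2]               # the middle column: [[2], [5], [8]]
# p's elements sit at offsets 1, 4, 7 of the same storage,
# so they are not adjacent in memory
print(p.is_contiguous())    # False

# .reshape() copies when the layout requires it
print(p.reshape(-1))        # tensor([2, 5, 8])

# .view() never copies; on a layout it cannot express it raises
try:
    t.t().view(-1)
except RuntimeError:
    print(".view(-1) failed; use .reshape(-1) or .contiguous().view(-1)")
```

The trade-off is that `.reshape()` may silently return a copy rather than a view, so in-place writes through the result are not guaranteed to reach the original tensor.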
I am going to close this; @douyipu provided a good description of the data layout of tensors, and specifically of what can or cannot be a view. If you have further questions, I would suggest moving this conversation over to dev-discuss:
🐛 Describe the bug
I apologize for not including all of the code, but the main content of the code is as follows. I encountered an error on the `.view(-1)` call; when I replaced `.view(-1)` with `.reshape(-1)` in the code, it worked without any issues. Thank you for reading.

Versions

PyTorch version: 2.2.0
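The reported fix can be sketched in isolation. The tensor below is a hypothetical stand-in for a non-contiguous gradient (the original code and error message are not included in the issue), showing the two standard ways to flatten it safely:

```python
import torch

# Hypothetical stand-in for a non-contiguous gradient tensor
grad = torch.arange(9.0).reshape(3, 3).t()
print(grad.is_contiguous())      # False

# Option 1: .reshape(-1) copies only when the layout requires it
flat = grad.reshape(-1)

# Option 2: make the memory contiguous first, then view
flat2 = grad.contiguous().view(-1)

print(torch.equal(flat, flat2))  # True
```

Both options produce the same flattened values; the difference is only whether the (possible) copy is implicit in `.reshape()` or explicit via `.contiguous()`.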