Conversation
add serialize and deserialize method for dpa1 descriptor
wanghan-iapcm
left a comment
Overall I like the PR very much. I am keen to see the numpy implementation simultaneously PRed to the deepmd-kit repo.
    yy: torch.Tensor
        The output.
    """
    yy = F.embedding(xx, self.matrix)
What operation does torch.nn.functional.embedding perform? How would you implement it in numpy?
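For reference, `F.embedding(xx, matrix)` is a row lookup: each integer index in `xx` selects the corresponding row of the weight matrix. A minimal numpy sketch of the same semantics (function name and shapes are illustrative, not part of the PR):

```python
import numpy as np

def embedding_np(xx: np.ndarray, matrix: np.ndarray) -> np.ndarray:
    # torch.nn.functional.embedding gathers rows of `matrix` by integer
    # index. Numpy fancy indexing reproduces this: `xx` may have any
    # shape, and the embedding dimension is appended as the last axis.
    return matrix[xx]

# demo: a table of 5 embeddings, each of dimension 3
matrix = np.arange(15, dtype=np.float64).reshape(5, 3)
xx = np.array([[0, 2], [4, 1]])
yy = embedding_np(xx, matrix)
print(yy.shape)  # (2, 2, 3)
```

With an index array of shape `(2, 2)` and a `(5, 3)` table, the output has shape `(2, 2, 3)`, matching the torch behaviour.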
    import numpy as np
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
Very bad short-hand name. Change required.
        return obj


    class EmbdLayer(nn.Module):
Should this be implemented via a native layer?
    yy: torch.Tensor
        The output.
    """
    yy = F.layer_norm(xx, tuple((self.num_in,)), self.matrix, self.bias, self.eps)
Why does layer_norm contain trainable parameters?
How would you implement it in numpy? Please provide the implementation to the deepmd-kit repo.
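A numpy sketch of the `F.layer_norm` semantics used above may clarify both questions: the normalization itself is parameter-free, but the elementwise affine transform (`gamma`, `beta`, matching `self.matrix` and `self.bias`) that follows it is what makes the layer trainable. Function and variable names here are illustrative:

```python
import numpy as np

def layer_norm_np(xx: np.ndarray, gamma: np.ndarray, beta: np.ndarray,
                  eps: float = 1e-5) -> np.ndarray:
    # Normalize over the last axis (the num_in features) ...
    mean = xx.mean(axis=-1, keepdims=True)
    var = xx.var(axis=-1, keepdims=True)
    xhat = (xx - mean) / np.sqrt(var + eps)
    # ... then apply the trainable per-feature affine transform; this
    # gamma/beta pair is why F.layer_norm carries learnable parameters.
    return gamma * xhat + beta

num_in = 4
xx = np.random.default_rng(0).normal(size=(2, num_in))
yy = layer_norm_np(xx, np.ones(num_in), np.zeros(num_in))
# with unit gamma and zero beta, each row is normalized to
# approximately zero mean and unit variance
```

With non-trivial `gamma`/`beta`, the output is a learned rescaling of the normalized features, which is the behaviour the quoted call relies on.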
    dtype = env.GLOBAL_PT_FLOAT_PRECISION


    class TestCaseSingleFrameWithNlist():
Better to import this from test_se_e2_a.py.
    # pre-allocate a shape to pass jit
    xyz_scatter = torch.zeros([nfnl, 4, self.filter_neuron[-1]], dtype=self.prec, device=env.DEVICE)
    for ii,ll in enumerate(self.filter_layers.networks):
    for ii,ll in enumerate(self.filter_layers._networks):
Better to provide a method for accessing the data of a class object, rather than reaching into the private attribute.
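The reviewer's suggestion can be sketched as a read-only property wrapping the private attribute, so calling code never touches `_networks` directly. The class and names below are a hypothetical illustration, not the actual deepmd-kit code:

```python
class NetworkCollection:
    """Illustrative container holding a list of filter networks."""

    def __init__(self, networks):
        self._networks = list(networks)

    @property
    def networks(self):
        # public, read-only accessor for the private data; callers
        # iterate `obj.networks` instead of `obj._networks`
        return self._networks

coll = NetworkCollection(["net0", "net1"])
for ii, ll in enumerate(coll.networks):  # not coll._networks
    print(ii, ll)
```

This keeps the storage detail (`_networks`) free to change later without breaking every call site that enumerates the networks.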
This PR aims to reformat the dpa1 descriptor in both the PyTorch and TensorFlow repos.