bdy9527 / FAGCN
License: Apache License 2.0
I think this work is very important for our current research. It would be very kind of you to upload your code for Fig. 1(a), which should be easy to reproduce.
Thanks for your reply.
Hi! Your work is of great help to my research.
I am now reproducing Fig. 1(a), but I found that the file "synthetic.py" only contains test code for the low-frequency and high-frequency signals of Fig. 1(a), with no test code for the FAGCN model itself. I have tried to write it myself, but the results are poor.
Could you please help me reproduce it, or supplement the file?
Eagerly awaiting your answer.
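While the official plotting code is unavailable, a small numeric sketch may help readers get started. The snippet below builds FAGCN-style low- and high-pass filters, F_L = εI + D^(-1/2)AD^(-1/2) and F_H = εI - D^(-1/2)AD^(-1/2), on a toy 4-node path graph; the graph, the ε value, and all variable names are my own illustrative choices, not the repository's code.

```python
import numpy as np

# Illustrative sketch (not the repository's code): FAGCN-style filters
#   F_L = eps*I + D^{-1/2} A D^{-1/2}   (low-pass)
#   F_H = eps*I - D^{-1/2} A D^{-1/2}   (high-pass)
# on a toy 4-node path graph. eps = 0.2 mirrors the paper's scalar.
eps = 0.2
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
d = A.sum(axis=1)                      # node degrees
A_norm = A / np.sqrt(np.outer(d, d))   # D^{-1/2} A D^{-1/2}
I = np.eye(len(d))
F_L = eps * I + A_norm
F_H = eps * I - A_norm

# The smoothest graph signal, x = D^{1/2} 1, is an eigenvector of A_norm
# with eigenvalue 1: the low-pass filter scales it by (eps + 1), while
# the high-pass filter scales it by (eps - 1), i.e. suppresses it.
x = np.sqrt(d)
print(F_L @ x / x)   # ~ [1.2, 1.2, 1.2, 1.2]
print(F_H @ x / x)   # ~ [-0.8, -0.8, -0.8, -0.8]
```

Note that F_L + F_H = 2εI, so the two filters split every signal into complementary low- and high-frequency responses.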
Hi, bdy,
I wonder whether self.g.apply_edges(self.edge_applying)
needs an argument to specify which edges it applies to, since the function is written as below:
def edge_applying(self, edges):
    h2 = torch.cat([edges.dst['h'], edges.src['h']], dim=1)  # g.srcnodes, g.srcdata['h']
    g = torch.tanh(self.gate(h2)).squeeze()  # remove the redundant dimension
    e = g * edges.dst['d'] * edges.src['d']
    e = self.dropout(e)
    return {'e': e, 'm': g}
When I run g.apply_edges(self.edge_applying)
independently in PyTorch 1.8.1, the error edge_applying() missing 1 required positional argument: 'edges'
appears.
When I use a lambda function in g.apply_edges, as in:
g.apply_edges(lambda edges: {'m': torch.tanh(gate(torch.cat([edges.dst['h'], edges.src['h']], dim=1))).squeeze()})
no error comes up.
Why does a function defined outside, like edge_applying(edges), lead to an error, while the inline lambda avoids it?
Looking forward to your kind reply!
Wu Shiauthie
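As a side note for readers hitting the same error: the message missing 1 required positional argument: 'edges' typically appears when the callable still expects a self parameter that nothing fills in. The pure-Python sketch below (no DGL; this apply_edges is a hypothetical stand-in for the real one) reproduces the behavior: the bound method self.edge_applying works because Python supplies self automatically, while a module-level copy that keeps the self parameter fails.

```python
def apply_edges(func):
    """Stand-in for g.apply_edges: calls func with a single edge-batch arg."""
    edges = object()  # stand-in for the edge batch DGL would pass
    return func(edges)

class Layer:
    def edge_applying(self, edges):
        return "ok"

layer = Layer()
# Bound method: `self` is already filled in, only `edges` remains to pass.
print(apply_edges(layer.edge_applying))  # prints: ok

# Module-level copy that keeps a `self` parameter: the single argument
# lands in `self`, and `edges` is never supplied.
def edge_applying(self, edges):
    return "ok"

try:
    apply_edges(edge_applying)
except TypeError as e:
    print(e)  # edge_applying() missing 1 required positional argument: 'edges'
```

So the lambda avoids the error simply because it takes exactly one parameter, just like the bound method does.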
Hi, Bo Deyu,
I am a Ph.D. student at NUDT. Recently I read your article on FAGCN; it is really inspiring! Wonderful work! However, one problem bothers me: why do we need to re-initialize the parameters with nn.init.xavier_normal_? What is the impact of using the default truncated-normal initialization? Hope to hear from you!
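For readers who want to compare the two schemes numerically, here is a small sketch of the scales involved, assuming PyTorch's nn.Linear (whose default initialization is Kaiming-uniform with a = √5, i.e. weights uniform on ±1/√fan_in, rather than a truncated normal). The layer sizes are illustrative (e.g. Cora's 1433 input features into 32 hidden units), not taken from the repository.

```python
import numpy as np

# Sketch (not the authors' code): comparing initialization scales.
# xavier_normal_:   std = gain * sqrt(2 / (fan_in + fan_out)), gain = 1
# nn.Linear default (kaiming_uniform_ with a = sqrt(5)):
#   bound = sqrt(2 / (1 + a^2)) * sqrt(3 / fan_in) = 1 / sqrt(fan_in)
fan_in, fan_out = 1433, 32  # illustrative: Cora features -> hidden units

xavier_std = np.sqrt(2.0 / (fan_in + fan_out))
default_bound = 1.0 / np.sqrt(fan_in)
default_std = default_bound / np.sqrt(3.0)  # std of U[-b, b] is b/sqrt(3)

print(f"xavier_normal_ std: {xavier_std:.4f}")
print(f"default init std:   {default_std:.4f}")
```

With these shapes, Xavier gives a noticeably larger weight scale than the default, since it also accounts for fan_out; the practical impact on accuracy is an empirical question best answered by the authors.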
Hello, and thank you for your excellent work. I would like to ask whether you could open-source the code for Figure 6 in the paper? If you could, I would be very grateful; it means a lot to me.
About the negative amplitude: how should it be understood, and why does it have to be avoided?
Please explain. Thanks.
Great work!
We are reproducing the statistical results in Table 2 and want to know how the 10 runs mentioned in the article were designed for FAGCN. Should we use random seeds [0, 1, 2, ..., 9] with the fixed hyperparameters from Sec. 5.2?
We have re-run the code with random seeds 0, 1, 2, and 3 on the Cora dataset, using the same hyperparameters mentioned in Sec. 5.2, but the test accuracy only reaches 0.803 (seed 0), 0.806 (seed 1), 0.809 (seed 2), and 0.798 (seed 3), respectively.
The torch version we used is 1.5.1, the same as specified in "requirement.txt".
Am I missing something? Could you please help me reproduce the results?
Thanks!
Related code:
# Hyperparameters setting
parser = argparse.ArgumentParser()
parser.add_argument('--dataset', default='cora') # new_squirrel, syn0
parser.add_argument('--lr', type=float, default=0.01, help='Initial learning rate.')
parser.add_argument('--weight_decay', type=float, default=1e-3, help='Weight decay (L2 loss on parameters).')
parser.add_argument('--epochs', type=int, default=500, help='Number of epochs to train.')
parser.add_argument('--hidden', type=int, default=32, help='Number of hidden units.')
parser.add_argument('--dropout', type=float, default=0.6, help='Dropout rate (1 - keep probability).')
parser.add_argument('--eps', type=float, default=0.2, help='Fixed scalar or learnable weight.')
parser.add_argument('--layer_num', type=int, default=4, help='Number of layers')
parser.add_argument('--train_ratio', type=float, default=0.6, help='Ratio of training set')
parser.add_argument('--patience', type=int, default=100, help='Patience')
args = parser.parse_args()
# Reset random seed
seed = 0 # 0,1,2,3
np.random.seed(seed)
torch.manual_seed(seed)
torch.cuda.manual_seed(seed)
# Load dataset
data = dgl.data.CoraDataset()
edge = data.graph.edges
feat = data.features
labels = data.labels
index = np.arange(feat.shape[0])
train = index[data.train_mask.astype(bool)]
val = index[data.val_mask.astype(bool)]
test = index[data.test_mask.astype(bool)]
...
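For reference, the mean-and-standard-deviation aggregation that multi-seed tables like Table 2 usually report can be sketched as below; the four accuracies are the seed results quoted above, used only as placeholder data (a full reproduction would loop over all 10 seeds).

```python
import numpy as np

# Sketch of aggregating multi-seed runs as mean +/- std.
# The accuracies below are the four seed results quoted above,
# used as placeholder data for illustration.
accs = {0: 0.803, 1: 0.806, 2: 0.809, 3: 0.798}
values = np.array(list(accs.values()))
mean, std = values.mean(), values.std()
print(f"accuracy: {mean:.3f} +/- {std:.3f} over {len(values)} seeds")
```

Whether the paper reports population or sample standard deviation (ddof=0 vs ddof=1) is not stated, so that choice here is an assumption.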