Minor bug fixes
Roman Joeres authored and Roman Joeres committed Sep 4, 2024
1 parent 8d35ee9 commit ab28865
Showing 2 changed files with 10 additions and 10 deletions.
18 changes: 9 additions & 9 deletions — configs/downstream/all.yaml

@@ -25,20 +25,20 @@ datasets:
     task: multilabel
     pre-transforms:
     model:
-      - name: rf
-        n_estimators: 500
-        n_jobs: -1
-        random_state: 42
-      - name: svm
-        random_state: 42
-      - name: xgb
-        random_state: 42
+      # - name: rf
+      #   n_estimators: 500
+      #   n_jobs: -1
+      #   random_state: 42
+      #- name: svm
+      #  random_state: 42
+      #- name: xgb
+      #  random_state: 42
       - name: mlp
         feat_dim: 1024
         hidden_dim: 1024
         batch_size: 256
         num_layers: 3
-        epochs: 1
+        epochs: 100
         patience: 30
         learning_rate: 0
         optimizer: Adam
2 changes: 1 addition & 1 deletion — gifflar/model/baselines/rgcn.py

@@ -63,7 +63,7 @@ def __init__(
             **kwargs)
 
         self.convs = torch.nn.ModuleList()
-        dims = [kwargs["feat_dim"], hidden_dim // 2] + [hidden_dim] * (num_layers - 1)
+        dims = [feat_dim, hidden_dim // 2] + [hidden_dim] * (num_layers - 1)
         for i in range(num_layers):
             convs = {
                 # Set the inner layers to be a single weight without using the nodes embedding (therefore, e=-1)
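The rgcn.py change looks like the actual bug fix: when `feat_dim` is declared as an explicit parameter of `__init__`, a caller's `feat_dim=...` argument is bound to that parameter and never lands in `**kwargs`, so `kwargs["feat_dim"]` raises `KeyError`. A minimal sketch of that Python behavior (`build_dims` is a hypothetical stand-in, not the repo's code):

```python
def build_dims(feat_dim, hidden_dim=1024, num_layers=3, **kwargs):
    # feat_dim is captured by the named parameter, so it never appears
    # in kwargs; kwargs["feat_dim"] here would raise KeyError.
    assert "feat_dim" not in kwargs
    # Same layer-width list as in rgcn.py: half-width first hidden layer,
    # then full-width layers.
    return [feat_dim, hidden_dim // 2] + [hidden_dim] * (num_layers - 1)

print(build_dims(1024, hidden_dim=64, num_layers=3))  # [1024, 32, 64, 64]
```

Referring to the parameter directly, as the committed line does, sidesteps the lookup entirely.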
