How do I resolve this error "ValueError: The number of parallel adapters and the number of active heads must match."? #701
-
I am able to do inference with both of my adapter models separately, without issues. I am not sure where the issue may be; I can provide additional details if you need. Based on what you showed, this code should be able to work:

```python
from transformers import AutoTokenizer
from adapters import AutoAdapterModel
import adapters.composition as ac
import torch

# Set up the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("FacebookAI/xlm-roberta-large")
model = AutoAdapterModel.from_pretrained("FacebookAI/xlm-roberta-large")

# Load both adapters
ph_adapters = model.load_adapter("/path1")
NER_adapter = model.load_adapter("/path2")

# Activate both adapters and their heads in parallel
model.set_active_adapters(ac.Parallel(NER_adapter, ph_adapters))
model.active_heads = ac.Parallel("ner_head", "ph_head")

def analyze_sentence(sentence):
    tokens = tokenizer.tokenize(sentence)
    input_ids = torch.tensor(tokenizer.convert_tokens_to_ids(tokens)).unsqueeze(0)  # add batch dimension
    outputs = model(input_ids)

    # Post-process NER output
    ner_labels_map = model.get_labels_dict(NER_adapter)
    ner_label_ids = torch.argmax(outputs[0].logits, dim=2).numpy().squeeze().tolist()
    annotated = []
    for token, label_id in zip(tokens, ner_label_ids):
        token = token.replace('\u0120', '')
        label = ner_labels_map[label_id]
        annotated.append(f"{token}<{label}>")
    print("NER: " + " ".join(annotated))

    # Post-process PH output
    sentiment_labels = model.get_labels_dict(ph_adapters)
    label_id = torch.argmax(outputs[1].logits).item()
    print("PH: " + sentiment_labels[label_id])
    print()

sentences = [
    "I'm Emma and I have a chest pain and it's been making it hard to breathe. I think I might have pneumonia.",
    "John Smith's Address is: 14 HAMLET Ct, CHATHAM CANADA.",
    # Turkish: "I'm Emma and I have chest pain and it's making it hard to breathe. I think I might have pneumonia."
    "Ben Emma ve göğüs ağrım var ve nefes almamı zorlaştırıyor. Zatürre olabileceğimi düşünüyorum.",
]

for sentence in sentences:
    analyze_sentence(sentence)
```
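As a side note on the post-processing in `analyze_sentence`: the NER branch simply maps each token's argmax label id through the dict returned by `get_labels_dict()`. The same mapping can be sketched in plain Python with a dummy label map and dummy logits (no model needed; the names and values below are purely illustrative):

```python
# Dummy id -> label map, standing in for model.get_labels_dict(NER_adapter)
ner_labels_map = {0: "O", 1: "B-PER", 2: "I-PER"}

# Dummy per-token logits for a 3-token input (shape: tokens x labels)
logits = [
    [2.0, 0.1, 0.0],  # argmax index 0 -> "O"
    [0.0, 3.5, 0.2],  # argmax index 1 -> "B-PER"
    [0.1, 0.3, 2.8],  # argmax index 2 -> "I-PER"
]

def argmax(row):
    # Index of the largest value in a row of logits
    return max(range(len(row)), key=row.__getitem__)

label_ids = [argmax(row) for row in logits]
labels = [ner_labels_map[i] for i in label_ids]
print(labels)  # ['O', 'B-PER', 'I-PER']
```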
-
Hey @charliezuo, the error message you mentioned might occur if not all loaded adapters have corresponding heads added to the model correctly. Usually, this should happen automatically in `load_adapter()` if a head is available in the checkpoint. You can check whether this is the case like this, after loading the adapters:

```python
assert NER_adapter in model.heads
assert ph_adapters in model.heads
```

There's one minor error in your code: it should be `active_head` instead of `active_heads` to explicitly activate loaded heads, i.e.:

```python
model.active_head = ac.Parallel(NER_adapter, ph_adapters)
```

This line is also not necessary, as `set_active_adapters()` will automatically also activate the heads if available. Hope this helps!
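For context, the `ValueError` in the title comes from a consistency check: with `ac.Parallel`, the forward pass runs one branch per adapter and expects exactly one active head per branch, so the counts must match. A schematic of that invariant in plain Python (the function name and the lists below are illustrative, not the library's actual code):

```python
def check_parallel_setup(active_adapters, active_heads):
    # Schematic version of the consistency check behind the ValueError
    # (not the library's real implementation)
    if len(active_adapters) != len(active_heads):
        raise ValueError(
            "The number of parallel adapters and the number of active heads must match."
        )

# Matching counts pass silently; a 2-vs-1 mismatch raises the ValueError
check_parallel_setup(["ner_adapter", "ph_adapter"], ["ner_head", "ph_head"])
```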