
Graph disconnected: transfer-learning model with top-layer #63

Open
KENJI-JIN opened this issue Jun 24, 2021 · 15 comments

Comments

@KENJI-JIN

KENJI-JIN commented Jun 24, 2021

I built the transfer-learning model below.
The top_layer is used to resize the images so that more of the image information can be used.

import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.applications import MobileNetV3Small
from tensorflow.keras.models import Model

# Pre-trained backbone, expecting 224x224 inputs
conv_base = MobileNetV3Small(weights='imagenet',
                             include_top=False,
                             input_shape=(224, 224, 3))

input_width = 450
input_height = 450
inputs = tf.keras.Input(shape=(input_height, input_width, 3))

# top_layer shrinks the 450x450 input to the backbone's 224x224:
# Conv2D (valid padding) -> 448x448x3, MaxPool2D -> 224x224x3
top_layer = layers.Conv2D(filters=3, kernel_size=3, strides=1)(inputs)
top_layer = layers.MaxPool2D(pool_size=(2, 2))(top_layer)

x = conv_base(top_layer, training=False)
x = layers.GlobalAveragePooling2D()(x)
#x = layers.Dropout(0.2)(x)

outputs = layers.Dense(1)(x)
outputs = layers.Activation("sigmoid")(outputs)

model = Model(inputs, outputs)

Then when I try to run Gradcam I get:
ValueError: Graph disconnected:

I referred to this site, but since I am using top_layer, I could not do it the same way:
https://stackoverflow.com/questions/60623869/gradcam-with-guided-backprop-for-transfer-learning-in-tensorflow-2-0

How can I solve this? Could you give me any idea?

@keisen
Owner

keisen commented Jun 24, 2021

Hi, @KENJI-JIN .

As far as I know, we can't avoid the error when we apply Gradcam to a model that contains a sub-model as a layer.
(If you construct a model that has the same layers as the sub-model and then copy the weights of the sub-model's layers into it, you might be able to make it work. But I don't know whether that works well.)

So I recommend adopting ScoreCAM, which is a gradient-free method.
For details, please see the notebook below:

https://github.com/keisen/tf-keras-vis/blob/master/examples/attentions.ipynb
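
To make the recommendation concrete, here is a minimal sketch (not taken from the notebook) of how the ScoreCAM call would look for the model above, assuming the tf-keras-vis 0.7.x API that appears later in this thread; `model` and a preprocessed batch `images` of shape (N, 450, 450, 3) are assumed to come from your own code:

from tf_keras_vis.scorecam import ScoreCAM
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Replace the final sigmoid with a linear activation and score the single
# output unit (index 0 of the one-unit binary head).
scorecam = ScoreCAM(model, model_modifier=ReplaceToLinear(), clone=True)
score = CategoricalScore(0)
cams = scorecam(score, images, penultimate_layer=-1)  # one heatmap per input image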

@KENJI-JIN
Author

Thank you for your response. I will try to adopt ScoreCAM.

@keisen
Owner

keisen commented Jun 24, 2021

@KENJI-JIN , we found some problems in ScoreCAM, so we recommend you temporarily install and use the commit below:

pip install -U git+https://github.com/keisen/tf-keras-vis.git@4a90becb02ed3d44825300fcb807dd58157787ba

I'm sorry to have troubled you.
The next release, which includes the commit above, will be published in a few days.

Thanks!

@KENJI-JIN
Author

Thank you for your information. I will upgrade it.

@amessalas

Hi @keisen

Regarding these:

So I recommend adopting ScoreCAM, which is a gradient-free method.
For details, please see the notebook below:

https://github.com/keisen/tf-keras-vis/blob/master/examples/attentions.ipynb

@KENJI-JIN , we found some problems in ScoreCAM, so we recommend you temporarily install and use the commit below:

I applied ScoreCAM to a model that contained a sub-model as a layer, but I still got the same error, ValueError: Graph disconnected. I am using the latest version, 0.7.2.

Here's a minimal, reproducible example:

# ScoreCAM issue
# Version 0.7.2
import numpy as np
from tf_keras_vis.scorecam import ScoreCAM
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D, Input
from tensorflow.keras.models import Model
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.utils import to_categorical


def get_model():
    base_model = VGG16(
        weights="imagenet", include_top=False, input_shape=(32, 32, 3)
    )
    base_model.trainable = False
    inputs = Input(shape=(32, 32, 3))
    x = base_model(inputs, training=False)
    x = GlobalAveragePooling2D()(x)
    outputs = Dense(10, activation='softmax')(x)
    model = Model(inputs, outputs)
    return model


if __name__ == '__main__':
    (x_train, y_train), (x_test, y_test) = cifar10.load_data()
    y_train, y_test = to_categorical(y_train), to_categorical(y_test)
    generator = ImageDataGenerator(rescale=1/255.)
    data_gen = generator.flow(x_train, y_train, batch_size=64)

    model = get_model()
    model.compile(
        loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"]
    )
    history = model.fit(
        data_gen,
        epochs=2  # just for the sake of this example
    )

    img, label = x_test[1]/255, np.argmax(y_test[1])

    scorecam = ScoreCAM(model, model_modifier=ReplaceToLinear(), clone=True)
    sc = scorecam(CategoricalScore(label), img, penultimate_layer=-1)
    # the last line, raises:
    # ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor
    # (type_spec=TensorSpec(shape=(None, 32, 32, 3), dtype=tf.float32, name='input_3'),
    # name='input_3', description="created by layer 'input_3'") at layer "block1_conv1".
    # The following previous layers were accessed without issue: []

Do you have any idea what could be wrong? Thanks in advance.

@keisen
Owner

keisen commented Aug 3, 2021

@amessalas , you can avoid the error by changing the get_model() function as follows:

def get_model():
    base_model = VGG16(
        weights="imagenet", include_top=False, input_shape=(32, 32, 3)
    )
    base_model.trainable = False
    x = base_model.output
    x = GlobalAveragePooling2D()(x)
    outputs = Dense(10, activation='softmax')(x)
    model = Model(base_model.inputs, outputs)
    return model
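
With this version of get_model(), a quick sanity check (reusing the imports and the variables img and label from the example above) should run without the graph-disconnected error:

model = get_model()
scorecam = ScoreCAM(model, model_modifier=ReplaceToLinear(), clone=True)
sc = scorecam(CategoricalScore(label), img, penultimate_layer=-1)
print(sc.shape)  # heatmap(s) for the seed input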

Thanks!

@keisen
Owner

keisen commented Aug 3, 2021

Hi @KENJI-JIN , have you resolved this issue?
If not, please feel free to ask about anything you're facing.

Thanks!

@KENJI-JIN
Author

KENJI-JIN commented Aug 4, 2021

Thank you for checking in.

I haven't resolved this issue yet.
I avoided it by creating a model from scratch without using a pre-trained model.
When I use a pre-trained model with "include_top=False", I will refer to the above comments.
If I have the same trouble again, I will ask in this issue.

Thank you in advance.

@keisen
Owner

keisen commented Aug 4, 2021

@KENJI-JIN , I'm happy to hear that you were able to avoid the error.

When I use a pre-trained model with "include_top=False", I will refer to the above comments.
If I have the same trouble again, I will ask in this issue.

I see. I'm looking forward to good news!

Finally, if you like tf-keras-vis, please star this repository.
Thanks!

@svaningelgem

I had exactly the same issue and found a solution for it. (Watch out: I'm a beginner at this.)

Maybe it's easy to integrate into this library, as the method should be generic enough?
I wrote it up here: https://medium.vaningelgem.be/how-to-flatten-a-keras-model-68b8716e296c
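
For reference, a minimal sketch of the flattening idea (my own summary, not code from the article), assuming the nested sub-model has a strictly linear layer graph such as VGG16; models with skip connections, like MobileNetV3Small, need the full graph-walking approach described in the article:

from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D, InputLayer
from tensorflow.keras.applications.vgg16 import VGG16

base = VGG16(weights="imagenet", include_top=False, input_shape=(32, 32, 3))
base.trainable = False

inputs = Input(shape=(32, 32, 3))
x = inputs
for layer in base.layers:
    if isinstance(layer, InputLayer):
        continue      # skip the nested input layer
    x = layer(x)      # reuse the pretrained layer objects (weights are shared)
x = GlobalAveragePooling2D()(x)
outputs = Dense(10, activation="softmax")(x)
flat_model = Model(inputs, outputs)   # one connected graph, no nested sub-model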

@rEstela

rEstela commented Feb 7, 2022

Hi @keisen !

I have a similar problem here; I don't know if you can help me.

This is the model I need to run Grad-CAM on. It is a binary classification problem:


Model: "functional_1"


Layer (type) Output Shape Param # Connected to


input_1 (InputLayer) [(None, 141, 898, 1 0 []
)]

input_2 (InputLayer) [(None, 12, 144, 22 0 []
4, 1)]

sequential (Sequential) (None, 64) 26448 ['input_1[0][0]']

sequential_1 (Sequential) (None, 96) 77424 ['input_2[0][0]']

concatenate (Concatenate) (None, 160) 0 ['sequential[0][0]',
'sequential_1[0][0]']

dense (Dense) (None, 16) 2576 ['concatenate[0][0]']

dense_1 (Dense) (None, 1) 17 ['dense[0][0]']

==================================================================================================
Total params: 106,465
Trainable params: 105,697
Non-trainable params: 768


When I run Grad-CAM, I get this:

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 12, 144, 224, 1), dtype=tf.float32, name='conv3d_input'), name='conv3d_input', description="created by layer 'conv3d_input'") at layer "conv3d". The following previous layers were accessed without issue: []

Running ScoreCAM, I get the same error...
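
A rough sketch of how the flattening approach above could be applied to this two-branch model (hypothetical names; trained_model stands for the trained "functional_1" model; Sequential sub-models have a linear layer graph, so their layers can be reapplied directly):

from tensorflow.keras import Input, Model

in_a = Input(shape=(141, 898, 1))
in_b = Input(shape=(12, 144, 224, 1))

x_a = in_a
for layer in trained_model.get_layer("sequential").layers:
    x_a = layer(x_a)          # reuse the trained layers of the first branch
x_b = in_b
for layer in trained_model.get_layer("sequential_1").layers:
    x_b = layer(x_b)          # reuse the trained layers of the 3D branch

x = trained_model.get_layer("concatenate")([x_a, x_b])
x = trained_model.get_layer("dense")(x)
outputs = trained_model.get_layer("dense_1")(x)
flat_model = Model([in_a, in_b], outputs)   # single connected graph for Grad-CAM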

@ProfessorHT

Hello,

Has anyone found a solution to this problem?

Thank you

@tguttenb

Hi, I am also trying to apply functions from this package to a transfer-learning problem and I am getting a similar issue with Gradcam, shown below. Are there any updates or ideas for a workaround? Any more information about fixes for this type of problem would be helpful.

File "C:\Users\paperspace\anaconda3\lib\site-packages\tf_keras_vis\gradcam.py", line 105, in call
cam = self._calculate_cam(grads, penultimate_output, gradient_modifier,

File "C:\Users\paperspace\anaconda3\lib\site-packages\tf_keras_vis\gradcam.py", line 124, in _calculate_cam
weights = K.mean(grads, axis=tuple(range(grads.ndim)[1:-1]), keepdims=True)

AttributeError: 'NoneType' object has no attribute 'ndim'

Thanks,
Tom

@IshaanShettigar

I am facing the same error; below is my model.summary():
(model summary screenshot not included)

Here is the error. Can anyone help?

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-53-8d8e7b328507> in <cell line: 4>()
      2 
      3 # generate heatmap with gradcam
----> 4 cam = gradcam(score, input_images, penultimate_layer= -1)
      5 
      6 f, ax = plt.subplots(nrows=1, ncols=2, figsize=(12,4))

6 frames
/usr/local/lib/python3.10/dist-packages/keras/engine/functional.py in _map_graph_network(inputs, outputs)
   1140                 for x in tf.nest.flatten(node.keras_inputs):
   1141                     if id(x) not in computable_tensors:
-> 1142                         raise ValueError(
   1143                             "Graph disconnected: cannot obtain value for "
   1144                             f'tensor {x} at layer "{layer.name}". '

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 224, 224, 3), dtype=tf.float32, name='input_8'), name='input_8', description="created by layer 'input_8'") at layer "block1_conv1". The following previous layers were accessed without issue: []

@abdelrazzaqabu

Faced the same problem :(
Hope someone finds a solution to this soon.
