
[tfjs-models/deeplab] Why is there no save method in class SemanticSegmentation (to save the model to localStorage/IndexedDB)? #8486

Open
cu8code opened this issue Dec 27, 2024 · 0 comments
Labels
type:feature New feature or request

Comments


cu8code commented Dec 27, 2024

System Information

  • TensorFlow.js version: 4.22.0
  • Are you willing to contribute: Yes

Feature Request Description
I propose adding a save method to the SemanticSegmentation class. Currently, I use a workaround where I save the underlying graph model to IndexedDB after loading it. Here's my implementation:

import * as tfconv from '@tensorflow/tfjs-converter';

let graphModel = null;
try {
    // Try the copy cached in the browser's IndexedDB first.
    console.log("loading from IndexedDB");
    graphModel = await tfconv.loadGraphModel('indexeddb://deep');
    console.log("loaded from IndexedDB");
} catch (e) {
    // Cache miss: fetch the model over the network, then cache it.
    console.log("failed to load from IndexedDB, loading from network");
    graphModel = await tfconv.loadGraphModel(
        modelConfig.modelUrl ||
        getURL(modelConfig.base!, modelConfig.quantizationBytes!)
    );
    console.log("saving to IndexedDB");
    await graphModel.save("indexeddb://deep");
}
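
For reference, here is a minimal sketch of what the proposed method could look like. This is an assumption about the class internals, not the library's actual code: the field name `model`, the simplified constructor, and the class name `SemanticSegmentationSketch` are placeholders; only `GraphModel.save(url)` and `tf.io.SaveResult` are existing TensorFlow.js APIs.

import * as tfconv from '@tensorflow/tfjs-converter';
import * as tf from '@tensorflow/tfjs-core';

// Hypothetical sketch: a wrapper that exposes save() on the segmentation
// class instead of requiring callers to reach into the raw GraphModel.
class SemanticSegmentationSketch {
    constructor(private model: tfconv.GraphModel) {}

    // Persist the underlying GraphModel to any registered IO target,
    // e.g. 'indexeddb://deep' or 'localstorage://deep'.
    async save(url: string): Promise<tf.io.SaveResult> {
        return this.model.save(url);
    }
}

With something like this in place, the fallback logic above could call segmenter.save('indexeddb://deep') directly instead of touching the raw GraphModel.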

Impact on Current API
This is a purely additive change: it would let browser applications cache the model in IndexedDB or localStorage without reaching into the underlying GraphModel, simplifying repeat-load and offline workflows.

Target Audience
Users running TensorFlow.js in the browser.

@cu8code cu8code added the type:feature New feature or request label Dec 27, 2024