- Create a directory and open a terminal in that directory
- Run
npm init
- Choose defaults
- Now install TensorFlow.js:
npm i @tensorflow/tfjs
- To run the code in a browser you also need a bundler (Parcel is recommended for that purpose). Run
npm install parcel-bundler --save-dev
This will add Parcel as a dev dependency.
- Now add the following to the scripts section of your package.json:
"dev": "parcel <your entry file>",
"build": "parcel build <your entry file>"
You can check whether your browser supports WebGL by visiting the WebGL test page (get.webgl.org) and seeing if the cube spins.
If your browser supports WebGL, TensorFlow.js uses the WebGL backend by default. You can check the active backend with the following code:
console.log(tf.version);
tf.ready().then(() => {
  console.log(tf.getBackend());
});
WebGL is a JavaScript API for rendering interactive 2D and 3D graphics within any compatible web browser without the use of plug-ins. WebGL is fully integrated with other web standards, allowing GPU-accelerated usage of physics and image processing and effects as part of the web page canvas.
If you instead want to use the WASM (WebAssembly) backend, follow these steps:
- Run the following command:
npm i @tensorflow/tfjs-backend-wasm
- Import it in index.js file
import '@tensorflow/tfjs-backend-wasm';
- Now set the backend to wasm (a combined sketch follows these steps).
tf.setBackend('wasm');
- Now you need a Parcel plugin to copy the static .wasm files from its node_modules directory into the application's dist folder:
npm i -D parcel-plugin-static-files-copy
This adds parcel-plugin-static-files-copy as a dev dependency.
- Now add the following to your package.json file:
"staticFiles": {
"staticPath": "./node_modules/@tensorflow/tfjs-backend-wasm/dist",
"excludeGlob": ["**/!(*.wasm)"]
},
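Putting the steps above together, a minimal index.js for the WASM backend could look like this (a sketch, assuming the package setup described so far):

import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-backend-wasm';

// switch to the WebAssembly backend before running any ops
tf.setBackend('wasm').then(() => {
  console.log(tf.getBackend()); // should now log "wasm"
});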
WASM is a good option if you want to target low-end devices with no GPU and limited processing power.
If your use case demands running your models on the server side, you can use Node.js for that.
Run the following command:
npm i @tensorflow/tfjs-node
Now that we have the server-side bindings for TensorFlow.js, we can test them out.
- Create a file server.js
- Paste the following code:
const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node');
console.log(tf.version);
tf.ready().then(() => {
  console.log(tf.getBackend());
});
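As a side note, @tensorflow/tfjs-node also exports the TensorFlow.js API itself, so the two require calls above can be collapsed into one if you prefer:

const tf = require('@tensorflow/tfjs-node');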
On Windows, you may run into errors because it asks for additional dependencies; on macOS Big Sur it worked perfectly fine.
Node.js can also be used to build API endpoints; the most popular framework for this use case is Express.
If you want to expose your TensorFlow.js code as an API, either for training or for making predictions, you can use Express:
npm i express
Now you can create an app using the following code:
const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node'); // registers the Node backend
const express = require('express');
const app = express();
Create an endpoint and call it "train"
app.get('/train', function (req, res) {
  // demo code <you can include your training code here>
  console.log(tf.version);
  tf.ready().then(() => {
    const msg = `Loaded TensorFlow.js version ${tf.version.tfjs} with backend ${tf.getBackend()}`;
    console.log(msg);
    res.send(msg);
  });
});

app.listen(9000, function () {
  console.log('Running server on port 9000 ...');
});
Now go to the URL localhost:9000/train and you will see the response from the API in the browser.
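If you prefer the command line, you can also hit the endpoint with curl, for example:

curl http://localhost:9000/train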