Hello World with TensorFlow.js

Sam - Jan 4 '22 - Dev Community

These days we have all heard a lot about AI and seen many of its applications, and we probably have ideas of our own that could be implemented with its help. But while AI may seem very complicated at first, getting started with it is not that difficult.

With the help of tools like TensorFlow, we can create interesting applications without even knowing any of the theories behind AI.


One of the advantages of TensorFlow is that it can be used even in the browser. Interestingly, not only can pre-trained models be used, but models can also be trained in the browser itself.

In this tutorial I'm going to show you how to train a simple model with TensorFlow.js to solve a linear equation, y=10x+4.

It looks simple, but don't forget this is a hello world example ;). I'll cover more advanced scenarios in future posts.

Let's create a simple HTML file and add the TensorFlow.js script to the head of the page.

<html>
<head>
 <!-- Load TensorFlow.js -->
 <!-- Check https://github.com/tensorflow/tfjs for the latest version -->
 <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@3.12.0"></script>
</head>
<body></body>
</html>

To solve an AI problem there are plenty of steps to take and many factors to consider, but one of the most basic ones is how to model the problem space. In simple terms: what do you give to your model, and what output do you expect from it? In this example we give the model a number and expect a number back, which should be the answer to the equation y=10x+4.

Let's take a quick trip back to elementary school and put together a table with a few samples:

x    y = 10x + 4
1    14
2    24
3    34
4    44
5    54

This is sample data we can actually use to train our model, and then expect it to give us the result for, say, x=6.
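
If you prefer to see where those numbers come from in code, here is a tiny illustrative sketch (not part of the tutorial itself) that generates the same sample pairs from the target equation:

// Illustrative only: generate (x, y) training pairs for y = 10x + 4
const xValues = [1, 2, 3, 4, 5];
const yValues = xValues.map((x) => 10 * x + 4); // [14, 24, 34, 44, 54]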

Did I mention you don't need to know anything about the underlying theory? Well, maybe that was not 100% accurate, but trust me, there is a big gap between what you need to know to start using TensorFlow and having in-depth knowledge of the underlying theory.
For now, just accept that we can use a simple, single-node neural network with the most basic parameters.

We create the model like this:

const model = tf.sequential();

You can find more details about tf.sequential here.

Now we can specify that we need only one node in our sequential model:

model.add(tf.layers.dense({units: 1, inputShape: [1]}));
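
If you are curious about what we have built so far, you can optionally print a summary of the model to the browser console. A single dense layer with one input and one unit has just two trainable parameters, the weight and the bias:

// Optional: inspect the model structure in the browser console
model.summary(); // one dense layer with 2 trainable parameters (weight + bias)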

Now we have to prepare the model for training and evaluation with the compile method:

model.compile({optimizer: 'sgd', loss: 'meanSquaredError'});

I have used pretty basic settings here, but if you want to read more about the compile method and its parameters you can check this link. I'll go through these details in more depth in upcoming tutorials.
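As a small taste of those parameters, here is a sketch of the same compile call with an explicit optimizer object instead of the 'sgd' string; the learning rate of 0.01 is just an illustrative value, not something the tutorial depends on:

// Same idea as the string shorthand, but with an explicit learning rate
model.compile({
  optimizer: tf.train.sgd(0.01), // stochastic gradient descent; 0.01 is an illustrative learning rate
  loss: 'meanSquaredError'
});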

Now let's represent our table in a format that's understandable to our model. We use the tensor2d method to do this. We want to train the model with 5 inputs and 5 outputs, each represented as a 5-by-1 tensor:

const xs = tf.tensor2d([1, 2, 3, 4, 5], [5, 1]);
const ys = tf.tensor2d([14, 24, 34, 44, 54], [5, 1]);

And we use the fit method to train the model. Let's set the iterations (epochs) to 400.

await model.fit(xs, ys, {epochs: 400});
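
If you want to watch the training converge, fit also accepts callbacks. Here is a sketch that logs the loss every 50 epochs; the logging interval is an arbitrary choice for illustration:

// Optional: log the training loss every 50 epochs to watch it shrink
await model.fit(xs, ys, {
  epochs: 400,
  callbacks: {
    onEpochEnd: (epoch, logs) => {
      if ((epoch + 1) % 50 === 0) console.log(`epoch ${epoch + 1}: loss = ${logs.loss}`);
    }
  }
});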

Finally, we use our trained model to predict the result for x=6.
Here I simply use the predict method and set the result as the innerText of my <div>:

document.getElementById('result').innerText = model.predict(tf.tensor2d([6], [1, 1]));
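
A quick note on the output: predict returns a tensor, so the innerText above ends up being the tensor's string representation. If you would rather display a plain number, one way (a sketch, assuming we are inside an async function like the solve function in the full example below) is to read the value back with data():

// predict returns a tensor; data() reads its values back as a typed array
const prediction = model.predict(tf.tensor2d([6], [1, 1]));
const value = (await prediction.data())[0]; // roughly 64, i.e. 10*6 + 4
document.getElementById('result').innerText = value.toFixed(2);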

Notice that, again, we are using a tensor (tensor2d to be specific) to feed data to our model.

This is the complete code for what we've done:

<html>
  <head>
    <!-- Check https://github.com/tensorflow/tfjs for the latest version -->
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@3.12.0"></script>
  </head>
  <body>
    <h1>Hello TensorFlow.js</h1>
    Result:
    <div id="result"></div>
    <script>
      async function solve() {
        const model = tf.sequential();
        model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
        model.compile({ optimizer: "sgd", loss: "meanSquaredError" });
        const xs = tf.tensor2d([1, 2, 3, 4, 5], [5, 1]);
        const ys = tf.tensor2d([14, 24, 34, 44, 54], [5, 1]);
        await model.fit(xs, ys, { epochs: 400 });
        document.getElementById("result").innerText = model.predict(
          tf.tensor2d([6], [1, 1])
        );
      }
      solve();
    </script>
  </body>
</html>

And this is the result, pretty accurate!!!

[Screenshot: the page showing the predicted result for x=6]

Now that we have our code ready, let's deploy it on Utopiops.

Head over to the Fully managed applications section, as we want to use the free static website deployment and hosting that Utopiops offers.

[Screenshot: the Fully managed applications section in Utopiops]

Now we choose Static website as the application type to be created. (Utopiops also offers free plans for Function and Dockerized applications.)

[Screenshot: choosing Static website as the application type]

Now the only thing we need to do is specify the repository where we store our code (Utopiops supports GitHub, Bitbucket and GitLab).

[Screenshot: specifying the repository that contains the code]

And that's it: in a few seconds our website is ready, and every time we push a change to our code, it is automatically deployed.

https://tensorflowjs-hello-world-8d21f585.sites.utopiops.com/

Note: Utopiops is in public beta at the time of writing this post, and the view you see when you log in to Utopiops at https://www.utopiops.com might be different, but the good news is that it has only become more user-friendly and easier to use.

This post was originally published here.
