
2.4: What is TensorFlow.js? (JavaScript Machine Learning)

JASON MAYES: So let's talk about machine learning in JavaScript now that you know what this is all about. Currently, over 68% of professional developers use JavaScript in production, and we have over 51% of developers using Node.js as a primary framework in production, too.

Now, as you may have guessed, TensorFlow.js is an open-source library for machine learning in JavaScript written by Google, and it actually started its life when Google researchers needed a way to visualize and use ML models in the browser, which led to the release of deeplearn.js back in 2017, around one year after TensorFlow itself was born.

This first incarnation was a very low-level mathematical library that allowed researchers to demonstrate simple models live in the browser, mostly for educational purposes, and you really needed to be an ML expert to use it at that time. It lacked the high-level features that ML developers were used to working with that made creating new models more manageable, but its initial success proved there was a desire to use ML in JavaScript.

As such, the need for a production-quality JavaScript library that was aligned with the original vision of TensorFlow emerged, which led to the birth of TensorFlow.js. This version of the library added support for high-level APIs that mimicked the popular Keras API for TensorFlow, making ML far more accessible in JavaScript.

This, in turn, led to a wave of ML researchers porting cutting-edge models to the JavaScript ecosystem, and now TensorFlow.js is used by companies of all sizes, from startups to large multinationals, and even individuals, from academics to hobbyists, who all embrace the unique benefits of using machine learning in JavaScript.

In fact, using TensorFlow.js means developers can run machine learning anywhere that JavaScript can run, which is basically everywhere. This equates to zero install on billions of devices globally. You can just follow a link, and it'll just work.

Here are all the environments we currently support: common web browsers, server side via Node.js, mobile native via React Native or Progressive Web Apps, desktop native via Electron, and even IoT on devices like a Raspberry Pi via Node.js.

Running client side, on device, is just one advantage of using TensorFlow.js, and this is very different from using TensorFlow Python, which typically runs server side. This, however, brings a unique set of challenges that you must consider, such as the types of devices your model may end up running on.

With server-side-based ML, you have a fixed hardware setup that is known in advance, along with the expected performance of a model on that specific machine. When you run models on the client side, on device, the hardware can change from user to user, which will affect the speed it can run at and, of course, the execution time of the model. Clearly, a five-year-old smartphone with a weak graphics processor will most likely run a given model slower than a current-generation phone if you try to execute the model on the graphics card.

So let's take a moment to focus on how TensorFlow.js is structured to better understand how you can get the best performance from your models and clear up some of the terminology around the library itself.

TensorFlow.js has two APIs you can work with. The first is our high-level Layers API, which is actually very similar to Keras, which, for those of you new to the TensorFlow ecosystem, is essentially an API that was made available in the Python form of TensorFlow that allows you to work at a much higher level when making custom models. So for those of you who have prior experience with this in Python, you'll feel very comfortable here. And for those of you who are new to all of this, there's no need to worry, as you'll be focusing on using the TensorFlow.js APIs in this course, which we'll be teaching you later.

Next, you have our low-level Ops API, which is more mathematical in nature and originates from our deeplearn.js beginnings. It enables you to go lower level, allowing you to do things like linear algebra to build pretty much anything. Typically, researchers will work at this level, creating new features for cutting-edge machine learning research. In this introductory course, you'll spend most of your time working at the Layers level when making custom models later on in the course.
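To make "linear algebra" concrete: at the Ops level you compose primitives like matrix multiplication directly (in TensorFlow.js this is `tf.matMul` on tensors). As an illustration only, here is a plain-JavaScript sketch of what that primitive computes; the real op does the same math in optimized, backend-accelerated form:

```javascript
// Plain-JS sketch of a matrix multiply, the kind of primitive the
// Ops API provides in optimized, backend-accelerated form (tf.matMul).
function matMul(a, b) {
  const rows = a.length;      // rows of a
  const inner = b.length;     // rows of b == cols of a
  const cols = b[0].length;   // cols of b
  const out = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (let i = 0; i < rows; i++) {
    for (let k = 0; k < inner; k++) {
      for (let j = 0; j < cols; j++) {
        out[i][j] += a[i][k] * b[k][j];
      }
    }
  }
  return out;
}

// 2x2 example: [[1,2],[3,4]] x [[5,6],[7,8]] = [[19,22],[43,50]]
console.log(matMul([[1, 2], [3, 4]], [[5, 6], [7, 8]]));
```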

So here, you can see how it all comes together. At the top, you have our pre-made models. These are built upon our Layers API, which itself sits on top of our lower-level Ops API. Now, this then understands how to work within many different environments, such as the client side, which includes things like the web browser, for example, and each one of these environments can execute on a number of different backends. Now, by backend here, I do not mean server side. Backend, in this context, refers to the hardware your model executes on.

So let's dive into backends in a little more detail. On the left, you have the CPU backend, which is always available; this is essentially vanilla JavaScript, but it's the slowest form of execution. Now, you also have WebAssembly for faster CPU execution, with support for SIMD and multithreading commands, which allow smaller models to run extremely fast on the CPU, sometimes even matching GPU performance, that is, on the graphics card, when running these smaller models.

You then have WebGL, which can leverage a graphics card, or GPU if you will, and supports 97.6% of devices. And that means you can run on more than just NVIDIA GPUs. You can run on AMD and Intel, too, so you can get great performance even on something like a MacBook that might not have a dedicated NVIDIA graphics card.
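TensorFlow.js manages this for you (`tf.setBackend('webgl')`, `tf.getBackend()`), falling back when a backend is unavailable. As a rough sketch of that fallback idea only, here the `available` list stands in for a real capability check (such as probing for a WebGL context), and `chooseBackend` is a hypothetical helper, not a TensorFlow.js function:

```javascript
// Sketch of backend fallback: try the fastest backend first, then
// fall back. The `available` list is a hypothetical stand-in for
// real capability probes (e.g. checking for a WebGL context).
function chooseBackend(available) {
  const preferred = ['webgl', 'wasm', 'cpu']; // fastest to slowest
  for (const name of preferred) {
    if (available.includes(name)) return name;
  }
  throw new Error('no backend available');
}

// A device without WebGL support falls back to WebAssembly.
console.log(chooseBackend(['wasm', 'cpu'])); // "wasm"
```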

And looking towards the future, we're seeing new web standards being formed, namely around WebNN and WebGPU, which we are also investigating to accelerate performance even further.

And there's a similar story for server-side environments, which is provided by a Node.js version of TensorFlow. For those of you new to Node.js, you can think of it as a special version of JavaScript that's designed to run on the server side and has tighter integration with the operating system than browser JavaScript does. Note here that our Node.js environment supports the same TensorFlow CPU and GPU bindings as Python has. In fact, both the Python and Node versions of TensorFlow are simply wrappers around the C++ core that server-side TensorFlow is written in behind the scenes.

This allows you to get the same or better performance as Python, as you, too, can leverage the CUDA and other hardware acceleration that's typically associated with the Python version of TensorFlow. But as JavaScript developers, you can also leverage the just-in-time compilation features of JavaScript in Node, which can lead to some great performance gains over Python when running on the server side, and we'll talk more about that later.

Now, some folks you work with may prefer to use Python for their machine-learning research. That, of course, is completely fine, and TensorFlow.js in Node supports the ingestion of Keras models and TensorFlow SavedModels directly within Node without any sort of conversion. And this is great for Python developers who directly integrate with web teams, which are most likely going to be using Node.js as their preferred server-side framework. Now, if you wish to execute Python models in the web browser, however, you can use our command-line converter to convert them to the format needed to run in the web browser. We'll cover this in more detail later in the course.
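The converter mentioned here is the `tensorflowjs_converter` tool from the `tensorflowjs` pip package; a typical invocation for a SavedModel looks something like the following sketch (the paths are placeholders):

```shell
# Install the converter (Python tooling), then convert a SavedModel
# into the web-friendly format TensorFlow.js can load in the browser.
pip install tensorflowjs

tensorflowjs_converter \
  --input_format=tf_saved_model \
  /path/to/saved_model \
  /path/to/web_model
```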

So what's the performance like? As you can see here, executing a model named MobileNet V2, which is basically an image-recognition model, on both Python and Node with GPU acceleration leads to less than 1 millisecond of difference in raw execution time. However, if you've got a lot of preprocessing, which is the act of converting data into a form that the model can use as an input, or postprocessing, which is the act of taking the output of a model and transforming it into something useful that you can use in your app, then Node.js can leverage the just-in-time compiler of JavaScript to see a significant boost in speed over Python.
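As a concrete example of the preprocessing being described, converting raw data into the form a model expects, here is a minimal sketch that scales 0-255 image pixel bytes down to the 0-1 range many image models take as input (the exact expected range depends on the model):

```javascript
// Preprocessing sketch: scale raw 0-255 pixel bytes to the 0-1
// floats an image model typically expects as input.
function normalizePixels(bytes) {
  return bytes.map((v) => v / 255);
}

console.log(normalizePixels([0, 51, 255])); // [0, 0.2, 1]
```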

And here, you can see, a company, Hugging Face, who are very well known for their natural-language-processing models, converted their model known as DistilBERT to Node.js, and this resulted in a two-times performance boost over the Python equivalent. So if performance is top of mind, you might want to give Node.js a try, especially if your end goal is to expose a server-side web API for inference, for which Node is very well suited.

In summary, here are the benefits of using Node.js on the server side. First, you can use the TensorFlow SavedModel format without conversion. This, in turn, may enable you to run larger models than you can on the client side. In some situations, it might not make sense to transfer a large model to the client side in the browser if it's gigabytes in size.

Next, it allows you to code in just one language, and this is great for developers already using JavaScript. Currently, 68% of developers do use JavaScript, and that can enable code reuse across the stack, which, for a small startup, can be very beneficial, as your existing JavaScript developers can also deploy server-side ML models for you, too, enabling greater use cases for your business.

And then we've got performance. As mentioned, our Node.js implementation talks to the same C++ bindings behind the scenes as Python does, so you'll get the same server-side hardware acceleration for both the CPU and the GPU. However, as mentioned, due to the just-in-time compiler in JavaScript, if your model requires pre- or postprocessing of data, you can get a performance boost there, too.

And finally, as this course is primarily about machine learning in the browser, let's talk about client-side superpowers that can only be achieved by running in the web browser. As the model runs entirely on the client machine, no data is ever sent to a third-party server, maintaining data privacy for the end user, which is particularly important for industries where it might be a requirement not to transfer data to a third party, not to mention growing concerns around privacy these days. And here, you get this privacy for free with TensorFlow.js.

As JavaScript has direct access to the device's sensors, such as the microphone, camera, accelerometer, and more, there's no round-trip time to the server and back. Now, latency to the server could be close to 100 milliseconds on, say, a mobile connection, and, assuming zero latency for using the machine-learning model itself, the maximum frame rate would cap out at about 10 frames per second if you're going to send images one by one, which is less than ideal. Now, with TensorFlow.js running on device, you can go much faster than that.
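The 10-frames-per-second figure above follows directly from the arithmetic: with a 100 ms round trip per image and zero model time, you can complete at most 1000 / 100 = 10 trips per second. A quick sketch:

```javascript
// Max frames per second when every frame costs a full server round
// trip, assuming zero model-inference time.
function maxFps(roundTripMs) {
  return 1000 / roundTripMs;
}

console.log(maxFps(100)); // 10 fps on a ~100 ms mobile connection
```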

If there's no data sent to the server, you'll have lower bandwidth and server hardware costs, as there's no server-side CPU, GPU, and RAM that need to be rented for inference. This means you just have to pay for hosting of the website assets and model files, which is far cheaper than running a dedicated ML server all of the time.

Web tech was designed for the display of content. From day one, it supported graphical content and has evolved to handle even richer formats, like video. As such, it has far more mature libraries for graphics and data visualization than other languages, such as Three.js or D3.js, allowing you to code your ideas in hours instead of days or weeks.

Finally, you've got the reach and scale of the web. Anyone, anywhere, can click a link and load a web page, and that machine learning will just work. Zero install is required to do this. And of course, you get more eyes on your cutting-edge research, and it may get used in other ways across industries. Now, let's head on to the next video to learn about the three ways you can use or create TensorFlow.js models using this library.
