Machine Learning models learn from data. In Deep Learning, the more data we have, the better the performance.
We need to transform the input data into a format the algorithms understand.
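For example, a helper like str2int (used later in this post) could map each character of the input string to an integer index. A minimal sketch; the humanVocab lookup table and the <unk> fallback are assumptions for illustration:

// Hypothetical character-to-index helper; humanVocab is an assumed lookup
// table mapping each known character to an integer index.
const str2int = value =>
  Array.from(value.toLowerCase()).map(ch =>
    ch in humanVocab ? humanVocab[ch] : humanVocab['<unk>']
  )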
The idea is to test your model with data it has never seen before.
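A minimal sketch of a hold-out split; the ratio and the helper name are illustrative, not from the original project:

// Shuffle the samples (Fisher-Yates) and keep a fraction aside as the test set.
function trainTestSplit(samples, testRatio = 0.2) {
  const shuffled = [...samples]
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1))
    ;[shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]]
  }
  const testSize = Math.floor(shuffled.length * testRatio)
  return { test: shuffled.slice(0, testSize), train: shuffled.slice(testSize) }
}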
Configure the hyperparameters and fit the model with the training set and the target labels.
Test and evaluate the model using the test set and metrics.
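For illustration, configuring, fitting and evaluating with the TensorFlow.js Layers API looks roughly like this. The original model was most likely trained and tuned in Keras/Python before conversion; the architecture, hyperparameters and variable names below are placeholders only:

import * as tf from '@tensorflow/tfjs'

// Illustrative only: a tiny classifier, not the project's actual architecture.
const model = tf.sequential()
model.add(tf.layers.dense({ units: 64, activation: 'relu', inputShape: [inputSize] }))
model.add(tf.layers.dense({ units: numClasses, activation: 'softmax' }))

// hyperparameter configuration (values are placeholders)
model.compile({
  optimizer: tf.train.adam(0.005),
  loss: 'categoricalCrossentropy',
  metrics: ['accuracy'],
})

// fit with the training set and the target labels
await model.fit(xTrain, yTrain, { epochs: 20, batchSize: 64 })

// evaluate on the held-out test set
const [testLoss, testAccuracy] = model.evaluate(xTest, yTest)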
Once we're happy with the results, we save and convert the model so it can be consumed by JavaScript.
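For reference, converting a saved Keras model with the tensorflowjs_converter CLI typically looks something like this (file names and the output directory are illustrative):

$> pip install tensorflowjs
$> tensorflowjs_converter --input_format keras model.h5 public/model/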
Project setup. The most important dependency:
$> npm install @tensorflow/tfjs
TensorFlow.js is a JavaScript library for training and deploying ML models in the browser and on Node.js.
Our component needs to import TensorFlow.js and load the converted model:
import * as tf from '@tensorflow/tfjs'
//...
async function loadModel(path) {
  // notify the main thread that the model is loading
  postMessage({ loading: true })
  // fetch the converted model (model.json + weight files);
  // tf.loadModel was renamed tf.loadLayersModel in later TensorFlow.js versions
  const model = await tf.loadModel(path)
  postMessage({ loading: false })
  return model
}
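The postMessage calls suggest this code runs inside a Web Worker. A hedged usage sketch; the model path and the message handling are assumptions:

// Load the model once and reuse it for every message from the page.
let modelPromise = null

self.onmessage = async event => {
  if (!modelPromise) {
    modelPromise = loadModel('/model/model.json')
  }
  const model = await modelPromise
  // ...preprocess event.data, predict, and postMessage the result back
}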
We apply the same preprocessing as we did during model training. Luckily, TensorFlow.js provides a nice API to help us with this:
//...
new Promise(resolve => {
  // map each character of the input string to its integer index
  const source = str2int(value)
  // one-hot encode the integer sequence
  const onehotSource = tf.oneHot(tf.tensor1d(source, 'int32'), numClasses)
  // add the batch dimension the model expects
  const reshapedSource = onehotSource.reshape([numSamples].concat(onehotSource.shape))
  //...
})
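A small standalone example of what tf.oneHot produces (the indices and depth are illustrative):

// Encode the indices [2, 0, 1] with a depth of 4.
const oneHot = tf.oneHot(tf.tensor1d([2, 0, 1], 'int32'), 4)
oneHot.print()
// [[0, 0, 1, 0],
//  [1, 0, 0, 0],
//  [0, 1, 0, 0]]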
The common principle underlying all supervised machine learning is that models learn a mapping from input to output.
In most cases, the output of the prediction needs to be converted back to a format we can easily display:
//...
const prediction = model.predict(/*...*/)
// here the prediction is a list of per-character output tensors; decode each one
const date = prediction.reduce((acc, pred) => {
  // pick the index with the highest probability
  const pIdx = pred
    .reshape([lenMachineVocab])
    .argMax()
    .get()
  // map the index back to its character and append it to the result
  return acc + invMachineVocab[pIdx]
}, '')
//...
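For context, invMachineVocab is simply the index-to-character lookup for the output vocabulary, built during the training preprocessing. For a date-style output it might look like this; the exact mapping is an assumption:

// Assumed shape of the inverse vocabulary: index -> output character.
const invMachineVocab = {
  0: '-',
  1: '0',
  2: '1',
  // ...one entry per character in the machine (output) vocabulary
}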