Digital Ocean Functions
How to run a Node.js script with dependencies via DigitalOcean Functions as a web URL

What is "DigitalOcean Functions"? It's more or less the same as AWS Lambda.

Let's say we need to feed two numbers into a script that performs some really heavy-duty calculation on them and returns the result. For simplicity, let's say the function is addition. I feed 2 and 3 into DO/Functions/projectX/add, and after 0.05 seconds it returns 5. Good. Suppose the same calculation would normally take 1 second on a DO droplet, which is 20 times slower. But the droplet won't charge anything extra for making that calculation, whereas DO Functions charges (say) 1 cent per calculation. So if we make 100 calculations via Functions we'll be billed $1, whereas our existing droplet won't charge a cent extra.

But when a million calculation requests come in, the droplet will be overloaded, since it's a single VPS, and that section of the site will be slow for our users. Functions can take that high load without slowing down the website, since we're offloading the addition to another service which also returns the result 20 times faster — but at the same time we'll be billed $10,000 for using Functions a million times. So it's a call of business versus technology.

[ Actual pricing is here - - which gives 90,000 GB-seconds of compute free per month per account. So you're good to start. ]
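To put that free tier in perspective, here's a back-of-the-envelope sketch. The 256 MB memory setting and 50 ms duration are assumed figures for illustration, not measurements:

```javascript
// Rough estimate of how many invocations 90,000 GB-seconds covers per month.
// Memory (256 MB) and duration (50 ms per call) are assumptions.
const freeGbSeconds = 90000;
const memoryGb = 256 / 1024;                         // 0.25 GB
const durationSeconds = 0.05;                        // 50 ms per call
const gbSecondsPerCall = memoryGb * durationSeconds; // 0.0125 GB-s

console.log(Math.round(freeGbSeconds / gbSecondsPerCall)); // 7200000 free calls
```

So at these assumed settings, the free tier covers on the order of millions of invocations before billing kicks in.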

To demonstrate using external dependencies in Node.js, we'll multiply two matrices using the external library mathjs instead of writing our own matrix-multiplication code.

Before this, we need to install doctl - - which is a command line tool for DigitalOcean services.

mkdir test-node
cd test-node
mkdir -p packages/cloud/matMul
cd packages/cloud/matMul/
npm init -y
npm install --package-lock-only mathjs
touch index.js

Type this code in index.js

const math = require('mathjs');

async function main(args) {
    const matrix_1 = JSON.parse("[" + args.m1 + "]");
    const matrix_2 = JSON.parse("[" + args.m2 + "]");

    const matrix_result = math.multiply(matrix_1, matrix_2);

    return { "body": { matrix_1, matrix_2, matrix_result } };
}

module.exports.main = main;

cd ../../..
touch project.yml

Type this in project.yml

packages:
  - name: cloud
    functions:
      - name: matMul
        binary: false
        runtime: 'nodejs:default'
        web: true
        parameters: {}
        environment: {}
        annotations: {}
        limits:
          timeout: 5000
          memory: 256

doctl serverless connect
Connected to function namespace 'fn-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' on API host ''

doctl serverless deploy .

Deploying '/Users/anjanesh-mac/workspace/serverless/functions/test-node'
  to namespace 'fn-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
  on host ''
Started running npm install --production in /Users/anjanesh-mac/workspace/serverless/functions/test-node/cloud/matMul
Finished running npm install --production in /Users/anjanesh-mac/workspace/serverless/functions/test-node/cloud/matMul
Deployment status recorded in '.nimbella'

Deployed functions ('doctl sbx fn get <funcName> --url' for URL):
  - cloud/matMul

doctl serverless functions get cloud/matMul --url

Visit the URL in the browser and you'll get this:

{
  "matrix_1": [[1, 2], [3, 4]],
  "matrix_2": [[5, 6], [7, 8]],
  "matrix_result": [[19, 22], [43, 50]]
}

which you can ultimately use in any script by parsing this JSON response.

Now, I tried offloading a similar matMul script using the TensorFlow.js library (const matrix_result = tf.matMul(matrix_1, matrix_2);), which requires @tensorflow/tfjs and @tensorflow/tfjs-node:

npm install --package-lock-only @tensorflow/tfjs @tensorflow/tfjs-node

But I keep getting this error: While deploying action 'cloud/matMul': 413 Payload Too Large

When I checked the size of the packages/cloud/matMul/node_modules folder using du -hs packages/cloud/matMul/node_modules/, I got some 700+ MB. So I'm not sure whether we can run scripts that depend on heavy modules. But the whole point of Functions is to be able to offload intense calculations — ones that would otherwise take a lot of time on a general-purpose VPS / droplet receiving all the traffic — to a separate service. I've sent a support ticket to DigitalOcean regarding this.
