Data & Derivatives

In this step we will extend our server so that we can list models, upload them, and prepare them for viewing.

Data management

First, let's make sure that our application has a bucket in the Data Management service to store its files in. Typically the bucket would be created just once as part of a provisioning step, but in our sample we will implement a helper function that makes sure the bucket is available, and use it in other parts of the server app.

Create a new file under the services/forge subfolder, and call it oss.js. This is where we will implement all the OSS (Object Storage Service) logic of our server application. Populate the new file with the following code:

services/forge/oss.js
const fs = require('fs');
const { BucketsApi, ObjectsApi } = require('forge-apis');
const { FORGE_BUCKET } = require('../../config.js');
const { getInternalToken } = require('./auth.js');

async function ensureBucketExists(bucketKey) {
    try {
        await new BucketsApi().getBucketDetails(bucketKey, null, await getInternalToken());
    } catch (err) {
        if (err.response.status === 404) {
            await new BucketsApi().createBucket({ bucketKey, policyKey: 'temporary' }, {}, null, await getInternalToken());
        } else {
            throw err;
        }
    }
}

async function listObjects() {
    await ensureBucketExists(FORGE_BUCKET);
    let resp = await new ObjectsApi().getObjects(FORGE_BUCKET, { limit: 64 }, null, await getInternalToken());
    let objects = resp.body.items;
    while (resp.body.next) {
        const startAt = new URL(resp.body.next).searchParams.get('startAt');
        resp = await new ObjectsApi().getObjects(FORGE_BUCKET, { limit: 64, startAt }, null, await getInternalToken());
        objects = objects.concat(resp.body.items);
    }
    return objects;
}

async function uploadObject(objectName, filePath) {
    await ensureBucketExists(FORGE_BUCKET);
    const buffer = await fs.promises.readFile(filePath);
    const results = await new ObjectsApi().uploadResources(
        FORGE_BUCKET,
        [{ objectKey: objectName, data: buffer }],
        { useAcceleration: false, minutesExpiration: 15 },
        null,
        await getInternalToken()
    );
    if (results[0].error) {
        throw results[0].completed;
    } else {
        return results[0].completed;
    }
}

module.exports = {
    listObjects,
    uploadObject
};

The ensureBucketExists function requests additional information about a specific bucket using the BucketsApi class from the Forge SDK, and if the response from Forge is 404 Not Found, it attempts to create a new bucket with that name.

As you can see, the getObjects method of the ObjectsApi class (responsible for listing files in a Data Management bucket) uses pagination. In our code we simply iterate through all the pages and return all files from our application's bucket in a single list.
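The pagination cursor is simply a `startAt` query parameter embedded in the `next` URL returned by the service, which our code extracts with the built-in URL class. A minimal sketch of that parsing step, using a made-up URL (not a real service response):

```javascript
// Hypothetical `next` URL, shaped like the ones returned by the
// Data Management service; the bucket and object names are made up.
const next = 'https://developer.api.autodesk.com/oss/v2/buckets/my-bucket/objects?limit=64&startAt=my-model.rvt';

// The built-in URL class parses the query string for us.
const startAt = new URL(next).searchParams.get('startAt');
console.log(startAt); // → 'my-model.rvt'
```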

Derivatives

Next, we will implement a couple of helper functions that will derive/extract various types of information from the uploaded files - for example, 2D drawings, 3D geometry, and metadata - that we can later load into Forge Viewer in our webpage. To do so, we will need to start a new conversion job in the Model Derivative service, and check the status of the conversion.

The Model Derivative service requires all IDs used in API calls to be base64-encoded, so we include a small utility function that will help with that.

info

Base64-encoded IDs are referred to as URNs.
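For example, a (hypothetical) object ID can be urnified with Node's Buffer class, stripping the trailing `=` padding characters the same way the urnify function below does:

```javascript
// Hypothetical object ID, for illustration only.
const id = 'urn:adsk.objects:os.object:my-bucket/my-model.rvt';

// Base64-encode and strip the `=` padding characters.
const urn = Buffer.from(id).toString('base64').replace(/=/g, '');

// The encoding is reversible even without the padding.
const decoded = Buffer.from(urn, 'base64').toString();
console.log(decoded === id); // → true
```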

Create another file under the services/forge subfolder, and call it md.js. This is where we will implement the logic for converting designs for viewing, and for checking the status of the conversions. Populate the new file with the following code:

services/forge/md.js
const { DerivativesApi } = require('forge-apis');
const { getInternalToken } = require('./auth.js');

async function translateObject(urn, rootFilename) {
    const job = {
        input: { urn },
        output: { formats: [{ type: 'svf', views: ['2d', '3d'] }] }
    };
    if (rootFilename) {
        job.input.compressedUrn = true;
        job.input.rootFilename = rootFilename;
    }
    const resp = await new DerivativesApi().translate(job, {}, null, await getInternalToken());
    return resp.body;
}

async function getManifest(urn) {
    try {
        const resp = await new DerivativesApi().getManifest(urn, {}, null, await getInternalToken());
        return resp.body;
    } catch (err) {
        if (err.response.status === 404) {
            return null;
        } else {
            throw err;
        }
    }
}

function urnify(id) {
    return Buffer.from(id).toString('base64').replace(/=/g, '');
}

module.exports = {
    translateObject,
    getManifest,
    urnify
};

Server endpoints

Now let's make the new functionality available to the client through another set of endpoints.

Create a models.js file under the routes subfolder with the following code:

routes/models.js
const express = require('express');
const formidable = require('express-formidable');
const { listObjects, uploadObject } = require('../services/forge/oss.js');
const { translateObject, getManifest, urnify } = require('../services/forge/md.js');

let router = express.Router();

router.get('/', async function (req, res, next) {
    try {
        const objects = await listObjects();
        res.json(objects.map(o => ({
            name: o.objectKey,
            urn: urnify(o.objectId)
        })));
    } catch (err) {
        next(err);
    }
});

router.get('/:urn/status', async function (req, res, next) {
    try {
        const manifest = await getManifest(req.params.urn);
        if (manifest) {
            let messages = [];
            if (manifest.derivatives) {
                for (const derivative of manifest.derivatives) {
                    messages = messages.concat(derivative.messages || []);
                    if (derivative.children) {
                        for (const child of derivative.children) {
                            messages = messages.concat(child.messages || []);
                        }
                    }
                }
            }
            res.json({ status: manifest.status, progress: manifest.progress, messages });
        } else {
            res.json({ status: 'n/a' });
        }
    } catch (err) {
        next(err);
    }
});

router.post('/', formidable(), async function (req, res, next) {
    const file = req.files['model-file'];
    if (!file) {
        res.status(400).send('The required field ("model-file") is missing.');
        return;
    }
    try {
        const obj = await uploadObject(file.name, file.path);
        await translateObject(urnify(obj.objectId), req.fields['model-zip-entrypoint']);
        res.json({
            name: obj.objectKey,
            urn: urnify(obj.objectId)
        });
    } catch (err) {
        next(err);
    }
});

module.exports = router;

info

The formidable() middleware used in the POST request handler will make sure that any multipart/form-data content coming with the request is parsed and available in the req.files and req.fields properties.

And mount the router to our server application by modifying server.js:

server.js
const express = require('express');
const { PORT } = require('./config.js');

let app = express();
app.use(express.static('wwwroot'));
app.use('/api/auth', require('./routes/auth.js'));
app.use('/api/models', require('./routes/models.js'));
app.listen(PORT, function () { console.log(`Server listening on port ${PORT}...`); });

The router will handle 3 types of requests:

  • GET /api/models - when the client wants to get the list of all models available for viewing
  • GET /api/models/:urn/status - used to check the status of the conversion (incl. error messages if there are any)
  • POST /api/models - when the client wants to upload a new model and start its translation
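
On the client side, these endpoints could be consumed with code along these lines. This is just a sketch: the function names are made up, and it assumes the page is served by the same server (so relative URLs work):

```javascript
// List all models available for viewing; resolves to an array of { name, urn }.
async function listModels() {
    const resp = await fetch('/api/models');
    if (!resp.ok) {
        throw new Error(await resp.text());
    }
    return resp.json();
}

// Upload a model (e.g., a File from an <input type="file">) and start its
// translation; `entrypoint` is only needed when uploading a ZIP archive.
async function uploadModel(file, entrypoint) {
    const data = new FormData();
    data.append('model-file', file);
    if (entrypoint) {
        data.append('model-zip-entrypoint', entrypoint);
    }
    const resp = await fetch('/api/models', { method: 'POST', body: data });
    if (!resp.ok) {
        throw new Error(await resp.text());
    }
    return resp.json();
}
```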

Try it out

Start (or restart) the app as usual, and navigate to http://localhost:8080/api/models in the browser. The server should respond with a JSON list with names and URNs of all objects available in your configured bucket.

info

If this is your first time working with Forge, you may get a JSON response with an empty array ([]), which is expected. In the screenshot below we can already see a couple of files that were uploaded to our bucket in the past.

tip

If you are using Google Chrome, consider installing JSON Formatter or a similar extension to automatically format JSON responses.

Server Response