
Service Tests

I created a complete test suite, depending on the alinex validator, which allows you to test web services in an easy and fast way. It is based on the mocha test runner.

This allows functional tests of the web services, which may be automated using any scheduling tool. A wrapper around the test can then translate the result into the monitoring tool's syntax. Run this test continually to check that the system really works.
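For example, a minimal wrapper could run the suite and map the result onto monitoring states. This sketch is written in TypeScript instead of bash, and the Nagios-style exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL) as well as detecting skipped tests via the word "pending" in the mocha output are assumptions about your monitoring setup:

// monitor.ts (hypothetical file): run the service tests and report a monitoring state
import { exec } from 'child_process';

exec('npm run test', (error, stdout) => {
    if (error) {
        // mocha exits non-zero if at least one test failed
        console.log('CRITICAL - service test failed');
        console.log(stdout);
        process.exit(2);
    } else if (/pending/.test(stdout)) {
        // skipped tests mean the service answered, but took too long (see helper below)
        console.log('WARNING - service responded too slowly');
        process.exit(1);
    } else {
        console.log('OK - all service tests passed');
        process.exit(0);
    }
});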

Setup

First create a new project and add the following package.json file:

{
    "name": "testing",
    "version": "0.1.0",
    "scripts": {
        "test": "mocha -r node_modules/ts-node/register test/mocha/*.ts"
    },
    "dependencies": {
        "@alinex/datastore": "^1.6.0",
        "@alinex/validator": "^3.4.2",
        "axios": "^0.19.0",
        "axios-debug-log": "^0.6.2",
        "chai": "^4.2.0",
        "mocha": "^6.2.1"
    },
    "devDependencies": {
        "@types/chai": "^4.2.3",
        "@types/mocha": "^5.2.7",
        "@types/node": "^12.7.8",
        "ts-node": "^8.2.0",
        "typescript": "^3.6.3"
    }
}

Afterwards, let npm install all of it for you:

npm install
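ts-node will pick up a tsconfig.json if one is present. The setup above does not include one, so the following minimal file is only an assumption; adjust the compiler options to your needs:

{
    "compilerOptions": {
        "target": "es2017",
        "module": "commonjs",
        "esModuleInterop": true
    }
}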

Helper

To make the tests themselves as simple as possible, I added two helper modules:

const debugError = require('debug')('http:error');
const debugRequest = require('debug')('http:request');
const debugRequestHeader = require('debug')('http:request:header');
const debugResponse = require('debug')('http:response');
const debugResponseHeader = require('debug')('http:response:header');
const debugResponseData = require('debug')('http:response:data');

require('axios-debug-log')({
    request: function(debug: any, config: any) {
        debugRequest('%s %s', config.method.toUpperCase(), config.url);
        debugRequestHeader(
            '%O',
            Object.keys(config.headers[config.method]).length
                ? config.headers[config.method]
                : config.headers.common
        );
        //        debugRequest('Request with %O', config);
    },
    response: function(debug: any, response: any) {
        debugResponse('%s %s', response.status, response.statusText);
        debugResponseHeader('%O', response.headers);
        debugResponseData('%O', response.data.toString());
        //        debugResponse('Response with %O', response);
    },
    error: function(debug: any, error: any) {
        // Read https://www.npmjs.com/package/axios#handling-errors for more info
        debugError(error.message);
        // debugError('Error %O', error);
    }
});

Once included, this module logs the HTTP traffic depending on the environment setting DEBUG=http:*.
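For example, to see the full HTTP traffic while running the suite:

DEBUG=http:* npm run test

The second helper module contains the actual test function: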

import { inspect, promisify } from 'util';
import { DataStore, Options } from '@alinex/datastore';
import { Validator } from '@alinex/validator';
import { importList } from '@alinex/validator/lib/schema';
import * as dns from 'dns';

require('./debug'); // include HTTP debugging
const lookup = promisify(dns.lookup);

const DEFAULT_TIMEOUT = 10000;
const DEFAULT_WARN_PERCENT = 0.5;

export function schemaTest(
    name: string,
    before: Function,
    path: string,
    setup: Options,
    schema: any
) {
    // description using VERBOSE=1 in environment
    if (process.env.VERBOSE)
        before(() =>
            console.log(
                `        ${name} should return ${inspect(schema, {
                    colors: true,
                    depth: 4
                }).replace(/\n/g, '\n        ')}`
            )
        );
    return (part: string, url: string, opt?: Options, timeout = DEFAULT_TIMEOUT) => {
        part = name.toLocaleLowerCase() + ' ' + part;
        url = url + path;
        opt = { ...setup, ...opt };
        const ds = new DataStore();
        // run test
        it(`should get data for ${part}`, async function() {
            if (process.env.VERBOSE) {
                console.log(`        Call ${name}: ${url}`);
                console.log(`        Options: ${inspect(opt)} with timeout ${timeout}`);
            }
            await lookup(new URL(url).hostname); // fail fast if the DNS name cannot be resolved
            const start = new Date();
            await ds.load({ source: url, options: opt });
            const end = new Date();
            if (process.env.VERBOSE)
                console.log(`        Response: ${inspect(ds.data).replace(/\n/g, '\n        ')}`);
            if (end.getTime() - start.getTime() > timeout * DEFAULT_WARN_PERCENT) {
                console.log(
                    `        Request took ${end.getTime() - start.getTime()}ms, that's too long!`
                );
                this.skip();
            }
        }).timeout(timeout);
        it(`should validate by schema for ${part}`, async function() {
            if (typeof ds.data === 'object' && !Object.keys(ds.data).length) this.skip();
            if (!schema.constructor.name.match(/Schema$/)) schema = await importList(schema);
            const val = new Validator(schema);
            return val.load({ data: ds.data });
        });
    };
}

This defines a test factory; each call of the returned function will run two tests:

  1. request and parse the data from the service
  2. validate the resulting data structure against the schema

The check will succeed if everything works fine; it will fail if

  • the DNS name could not be resolved
  • the data could not be retrieved
  • the validation failed

And it will be skipped if everything works but the retrieval took more than 50% (DEFAULT_WARN_PERCENT) of the maximum time - this is treated as a WARNING state. The default timeout for a service is 10 seconds (DEFAULT_TIMEOUT), but it can also be overridden per test when calling the helper.
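For example, a test created with this helper (like the ping test in the next section) can get a longer timeout for a single, known-slow endpoint; the name and URL here are only illustrative:

// 60 second timeout for this one call, so the WARNING threshold becomes 30 seconds
ping('slow-node', 'http://192.168.1.100:8080', {}, 60000);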

Test Methods

All tests should be written in 'test/mocha/...ts'.

The test is used in two steps:

  1. Definition
  2. Calling the test

Together this looks like:

import { StringSchema } from '@alinex/validator/lib/schema';
import { schemaTest } from '../../../lib/service/helper';

describe('my test', () => {
    // name of test, before function (from describe), path to call, options, schema
    const ping = schemaTest('Ping', before, '/do/ping', {}, new StringSchema({ allow: ['OK'] }));
    // this can be called multiple times
    ping('loadbalancer', 'https://my-site.de');
    ping('node-1', 'http://192.168.1.100:8080');
    ping('node-2', 'http://192.168.1.100:8080');
});

This will run the tests and show something like:

npm run test
  my test
    ✓ should get data for ping loadbalancer (71ms)
    ✓ should validate by schema for ping loadbalancer
    ✓ should get data for ping node-1 (71ms)
    ✓ should validate by schema for ping node-1
    ✓ should get data for ping node-2 (71ms)
    ✓ should validate by schema for ping node-2

  6 passing (237ms)

Info

Keep in mind that if a test is pending, its response was too late.