###Getting Started
The blade-common package is a Sails add-on that, when installed in another Sails application, adds common models, routes, controllers, and services that will be available to you in your Blade Sails service or adapter.
##Install
Normally, you would just npm install this from Blade's private npm registry, but, as this package is still under development, installing in this way will only get you what has been committed to the `master` branch of the git repo. If this is what you want, then:
```
npm install https://github.com/bladesystem/blade-common.git
```
######Note: (blade-common will eventually reside in the Blade private npm repository, but not until we have a version 1.0 release)
However, if you want to take advantage of what's being developed in the `development` branch of the repo, then you'll need to create a symbolic link from your project's `node_modules` directory to the `development` branch of this package:
1. Check out the `development` branch of `blade-common` somewhere on your local machine.
2. `cd` into the root directory of your project (the project where you want to include `blade-common`) and do an npm install of this package (as above).
3. `cd` into the `node_modules` directory of your project and delete the `blade-common` directory that was just created there.
4. Create a symlink in the `node_modules` directory that refers to the root directory of the `blade-common` source that you checked out in step 1.
######Note: If you do an `npm update` or `npm install`, it will replace the symlink that you created with source from the `master` branch and you'll have to create the symlink again.
##Service Object
Within a Blade service, you can call out to other Blade services by using the Service object provided in this package. The Service object is a Sails service object and, as such, will be available to you anywhere. You don't have to `require` anything; it's just there waiting for you to use it.
One of the first things that you'll want to do (in your `/config/bootstrap.js` file) is to register your service. You use the `Service` object to do this:
```javascript
module.exports.bootstrap = function (cb) {
    Service.register({name: 'reporting', type: 'service'})
        .then(function() {
            cb();
        });
};
```
Valid types are `service` and `adapter` (although there could be more types in the future).
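An adapter would register itself the same way in its own `/config/bootstrap.js`, just with `type: 'adapter'`. A minimal sketch (the adapter name `'cashway'` is purely illustrative and not part of blade-common):

```javascript
// /config/bootstrap.js of a hypothetical adapter - the name 'cashway' is an
// illustrative example only; use your own adapter's name here.
module.exports.bootstrap = function (cb) {
    Service.register({name: 'cashway', type: 'adapter'})
        .then(function() {
            cb();
        });
};
```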
If you register a `service`, then the auto-discovery mechanism starts. The `Service` object uses the auto-discovery facility to find other Blade services without you having to provide any additional information in your source.
You use the Service object to make requests on other services:
```javascript
Service.request('service.client').get('/clients/1234')
    .then(function(client) {
        //use the client object
    })
    .catch(function(error) {
        //do something with the error
    })
    .finally(function() {
        //do something regardless of whether the request succeeded or failed
    });
```
`Service.request()` returns a request object with HTTP methods that you can call if the service is available, or a rejected promise if the service is not available.
You can use the `.get()`, `.put()`, `.post()`, or `.delete()` methods on the request object to make HTTP requests on the service that was specified in the `Service.request()` call. Each of these methods returns a promise that resolves with the result.
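Because each request returns a promise, the result of one service call can drive a request to another. Here is a minimal sketch, assuming a hypothetical `service.account` service and an `accountId` property on the returned client (neither is defined by blade-common; substitute the services and routes your project actually uses):

```javascript
// Fetch a client, then look up its account on another Blade service.
// 'service.account', '/accounts/...', and client.accountId are assumptions
// made for illustration only.
Service.request('service.client').get('/clients/1234')
    .then(function(client) {
        return Service.request('service.account').get('/accounts/' + client.accountId);
    })
    .then(function(account) {
        //both requests succeeded - use the account object
    })
    .catch(function(error) {
        //handle a failure from either request
    });
```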
###Service Mocks
In order to allow for automated testing (and continuous integration / continuous deployment), other services will not be running when your test code is executed. A Service mocking facility is available to you for just this purpose.
This facility is very flexible in how it allows you to define your Service Mock objects. You can use whatever mocking library you want, as long as it provides a mechanism to mock http requests.
We suggest using nock.js, as it has been tested with and works very well with the blade-common Service Mock facility.
The `nock` library intercepts outgoing http requests and responds with the values you set.
###Using Mocks in Testing
There are several ways to use the Service Mocks facility in your testing code. The first is to simply add it in-line to your test code:
####Using Service Mocks in-line in your tests
```javascript
var expect = require('chai').expect;
var supertest = require('supertest');
var nock = require('nock');
var Promise = require('bluebird');
var sails = require('sails');

nock.enableNetConnect();

describe("Controller Test", function () {
    describe("POST /doit", function () {
        it("should do it!", function (done) {
            //set up the mock for the client service
            Service.addMock('service.client', function() {
                return nock("http://service.client")
                    .get("/client/1234")
                    .reply(200, {
                        data: 'the data'
                    });
            });

            //test it
            supertest(sails.hooks.http.app)
                .post('/doit')
                .send({
                    client: 1234,
                    thingToDo: 'foo'
                })
                .expect(201, done);
        });
    });
});
```
The first parameter to the `Service.addMock` call is the name of the service that needs to be mocked for the test. In this case, we're testing code that contains a request to the client service, so we use `'service.client'` as the name of the service that we are mocking. This will create a mocked object that will be used instead of trying to make a request on the client service, which won't be running when our tests are executed.
The second parameter to the `Service.addMock` call is a function that returns the object that will be mocking the client service (in this case, a `nock` interceptor, but, again, it could be an object from another mocking library as long as it allows you to define what should be returned from a specific http request).
Our convention for the mocked host url is `http://service.name`. This is just a convention - it doesn't correspond to anything in the 'real' world, and, if you wanted to use a different url, that would be fine.
For complete documentation on nock, check out the nock repository on github.
####Re-using your mocks in multiple tests
If you want to reuse your mocks in other tests, then you can create a file called `mocks.js` where you define your service mocks.
A typical `mocks.js` file looks like this:
```javascript
var nock = require('nock');

nock.enableNetConnect();

module.exports = {
    'service.client': function() {
        return nock("http://service.client")
            .get("/foo")
            .reply(200, {
                "client": "foo",
                "account": "bar"
            })
            .get("/baz")
            .replyWithFile(200, __dirname + '/serviceMocks/replies/bazResponse.json');
    },
    'service.foo': function() {
        return nock("http://service.foo")
            .get("/bar/xyz")
            .reply(200, {
                foo: 1,
                bar: 0
            });
    }
};
```
In order to use the `mocks.js` file in your tests, you need to add a `mocks` property in the `configs` section of your `test/bootstrap.test.js` file:
```javascript
.
.
.
var configs = {
    log : {
        level: 'info'
    },
    connections: {
        memory: {
            adapter: 'sails-memory'
        }
    },
    models : {
        connection: 'memory'
    },
    hooks: {
        grunt: false
    },
    mocks: {
        path: __dirname + '/serviceMocks/mocks.js'
    }
};
.
.
.
```
This particular configuration indicates that the `mocks.js` file is in a directory called `serviceMocks` under the `test` directory. Notice that the `mocks.js` file defined above uses that same directory to contain the responses to various mocked endpoints.
####Note
If you create a `mocks.js` file (and any corresponding response files), you should check them into your project's git repo as they are part of your test suite and will be used when automated tests are run for continuous integration and deployment.
###Using Mocks During Development
The Blade service you are writing will probably need to collaborate with one or more other Blade services in order to fulfill its particular functional responsibility. While we have tried to make collaborating with other Blade Services as easy as possible during local development, there may be times where you need more control over a specific service's response to a particular route, or you may be writing functionality that depends on a service that hasn't even been written yet!
You can use Service Mocks outside of the testing environment to facilitate these kinds of local development scenarios.
####Configuration
The simplest way to use mocks in your local development is to create a `mocks.js` file in your project's `config` directory. It should be constructed in exactly the same way as the `mocks.js` file described above for use in your tests. You should add this file to your `.gitignore` file so that it won't be checked into your project's repo. It should only be present in your local development environment.
In order to enable the `mocks.js` file, you need to add a `mocks` configuration element in your `config/local.js` configuration file and set it to `true`:
```javascript
module.exports = {
    log: { level: 'info' },
    mocks: true,
    .
    .
    .
```
####Alternate Configurations
If you don't want to put your `mocks.js` in your project's `config` directory (say you want to use the one that you already have in your `test` directory), then you can specify the path to the `mocks.js` file that you want to use (exactly like you would in the `bootstrap.test.js` file for tests):
```javascript
mocks: { path: __dirname + '/../test/serviceMocks/mocks.js' }
```
####Service Mock Behavior During Development
Service Mocks in use during development work exactly as they do during testing, replacing the external service requests with the mocked results - as long as the mocked service is NOT actually running in your local environment! If you actually run the service that is being mocked in your local environment, this will override the mocked service, and any calls to `Service.request()` on the mocked service will make an actual request to the locally running service.
This is useful because it lets you switch between a mocked service and the actual running service simply by starting and stopping the service that you are mocking, without ever having to shut down and restart your own service or change your configuration.
###USE_MOCKS Environment Variable
In order for Service mocks to work in your tests or during development, the `USE_MOCKS` environment variable must be set. This is done automatically when you use the `npm test` script on *nix systems, or when you set the `mocks` configuration property to `true` or assign it a valid path to a javascript file with your mock definitions.
The `npm test` script sets the `NODE_ENV` environment variable to `test`, and, as a result, the `USE_MOCKS` environment variable is automatically set to `true` when the `npm test` script is run from the command line, so you don't have to set the `USE_MOCKS` environment variable directly.
Unfortunately, the `npm test` script does not work on Windows systems, and testing must be performed another way. If you are using the JetBrains WebStorm IDE, you can set `NODE_ENV` to `test` in a run configuration and run your tests that way, and everything will work as expected.
######If anyone can figure out how to set the `NODE_ENV` environment variable in the npm test script so that it works correctly on Windows systems, please do so!
Setting the `mocks` configuration property in your `config/local.js` file will also set this environment variable if and only if there is either a `mocks.js` file in your `config` directory, or you have provided a path to a valid file with valid service mock definitions.
Under normal circumstances, you should never have to set or modify the `USE_MOCKS` environment variable directly.
##Data Seeding
The blade-common package will automatically seed your models whenever you register your service or adapter. This is necessary for testing, but is also useful in normal development if you have data that needs to be loaded before anything else happens. In order for seed data to be loaded into a model, add the following outside the `attributes` section of your model definition:
```
seedData: <path to JSON data to load>
```
For example, here's the definition for the Currency Model:
```javascript
module.exports = {
    autoPK: false,
    attributes: {
        code: {
            type: 'string',
            size: 3,
            required: true,
            unique: true,
            primaryKey: true
        },
        symbol: {
            type: 'string',
            required: true
        },
        name: {
            type: 'string',
            required: true
        },
        symbol_native: 'string',
        decimal_digits: {
            type: 'integer',
            required: true
        },
        rounding: 'integer',
        name_plural: 'string'
    },
    seedData: __dirname + '/../../lib/data/currencies.json'
};
```
You can put your seed data anywhere you wish; just provide a relative path to it in the `seedData` property of your model definition.
The seed data functionality will not load data into the model if there is already data there. It will only load data if there is NO data already present.
##REST Responses
The blade-common package extends the Sails response object with several responses that will assist in writing REST endpoints that are consistent with all other packages in the Blade Payment System.
###2xx Success
####200 OK
####201 Created
###4xx Client Error
####400 Bad Request
####401 Unauthorized
####403 Forbidden
####404 Not Found
###5xx Server Error
####500 Internal Server Error
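A minimal sketch of how a controller action might use these responses. `res.created()` and `res.badRequest()` appear in the KYC example later in this document; the other response names and the `Client` model are assumptions based on standard Sails conventions, not part of blade-common's documented API:

```javascript
// Hypothetical controller action illustrating the extended REST responses.
// The Client model, the route, and the response names other than res.created()
// and res.badRequest() are assumptions for illustration only.
module.exports = {
    findOne: function (req, res) {
        Client.findOne({id: req.param('id')})
            .then(function (client) {
                if (!client) {
                    return res.notFound();      // 404 Not Found
                }
                return res.ok(client);          // 200 OK
            })
            .catch(function (err) {
                return res.serverError(err);    // 500 Internal Server Error
            });
    }
};
```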
##Query Service
To return one result from the query service, matching the model's primary key against the named parameter in the request:
```javascript
QueryService.findOne(model, req, {pkParamName: "primaryKeyName"})
```
Or if the request parameter key matches the primary key id of the model, you can simply call:
```javascript
QueryService.findOne(model, req)
```
To return a collection of results querying by any field(s), where the model field matches the parameter in the request:
```javascript
QueryService.find(model, req, {getBy: "field1"})
QueryService.find(model, req, {getBy: ["field1", "field2", ...]})
```
To map the model field to a parameter in the request that is named differently:
```javascript
QueryService.find(model, req, {getBy: {id: "client_id"} })
```
And to mix fields that are mapped to differently-named request parameters with fields that match the request parameter names directly:
```javascript
QueryService.find(model, req, {getBy: {id: "client_id", accountId: true} })
```
All `getBy` fields are optional - if the key is not in the request parameters, it is not added to the criteria.
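For example, a controller action can delegate a lookup to the QueryService. This is a sketch that assumes `QueryService.find()` returns a promise and uses a hypothetical `Client` model and `res.ok()`/`res.serverError()` responses; adjust it to your own models and routes:

```javascript
// api/controllers/ClientController.js (hypothetical)
// Find clients by the client_id and accountId request parameters, mapping
// client_id onto the model's id field. The Client model, the responses used,
// and the promise returned by QueryService.find() are assumptions.
module.exports = {
    find: function (req, res) {
        QueryService.find(Client, req, {getBy: {id: 'client_id', accountId: true}})
            .then(function (clients) {
                return res.ok(clients);
            })
            .catch(function (err) {
                return res.serverError(err);
            });
    }
};
```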
Additionally, the following fields are available in the URL query string to get specific data from routes that utilize the QueryService:
- `where`
  `/retrievemodels?where={"somefield":"somevalue"}`
- `limit` and `skip` - 10 and 0 by default
  `/retrievemodels?limit=20&skip=20`
- `sort`
  `/retrievemodels?where={"type":"foo"}&sort=date DESC`
- `select` - retrieves only the specified fields; uses a comma-delimited string
  `/retrievemodels?select=field1,field2`
- `populate` - populates the specified relationship fields; uses a comma-delimited string. The query service does not populate by default and returns only the related model's primary key.
  `/retrievemodels?populate=relationship1,relationship2`
  You can also just pass "all" to populate all relationship fields:
  `/retrievemodels?populate=all`
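These query string options can be combined in a single request, for example:

```
/retrievemodels?where={"type":"foo"}&limit=20&skip=20&sort=date DESC&select=field1,field2&populate=all
```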
##KYC Service
The KYC Service takes multipart file upload requests, validates them, and streams them to our Amazon S3 Bucket. It then calls the Images Service to create and store the KYC document metadata.
To use this facility, you will need:
- Your AWS Access Key and AWS Secret Access Key. These are supplied to the KYC service through the environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.
- A local directory where uploaded files are staged for validation before they are streamed to S3. You can define this in `/config/uploadDirectory.js`.
Standard Boilerplate for using the KYCService in your route:
```javascript
KYCService.upload(req, "client_id", "blade_token")
    .then(function(data){
        return res.created(data);
    })
    .catch(function(err){
        res.header('Connection', 'close');
        return res.badRequest(err);
    });
```