getting-started-with-apollo-server-dataloader-knex.mo

Owner: Thomas Pucci

Prerequisites (~12min)

    Have Yarn installed (~5min)
    Have Docker and Docker-compose installed (~5min)
    Have Postman installed (~2min)

Thanks

Thanks to Tycho Tatitscheff and Yann Leflour for helping me with their great BAM API repo.

Context

In this standard, we will create a Heroes GraphQL API. We will have a Hero model with superheroes' real names and hero names. We will add one example of an association.
Our API will be lightly protected and will use batching to minimise DB round-trips.

Steps (~61min)

Note: You should commit after each step.

Initialise a new project (~6min)

    Create and go to a new directory for the project: mkdir graphql_formation && cd graphql_formation
    Init a git repository: git init
    Create two services with Docker-compose, one postgres database and one node server:
      For this step, notice that our final folder architecture looks like this:
      πŸ“‚ graphql_formation
      β”œ πŸ“‚ api
      β”‚ β”” πŸ—‹ Dockerfile
      β”œ πŸ“‚ db
      β”‚ β”” πŸ—‹ Dockerfile
      β”œ πŸ—‹ config.env
      β”” πŸ—‹ docker-compose.yml
      Make sure your local 3000 port is available as we will use this port to reach our API.
      In a new api/Dockerfile file, write all the commands to assemble the API image (note: there is no yarn install here because the api folder, including the node_modules installed from your machine in the next step, will be mounted into the container as a volume):
    FROM node:8.1.0-alpine

    WORKDIR /usr/src/api

    EXPOSE 3000
    CMD ["yarn", "run", "serve"]
      In a new db/Dockerfile file, write all the commands to assemble the db image:
    FROM postgres:9.6.3
      In a new docker-compose.yml file, declare the two services:
    version: '3'
    services:
      api:
        build:
          context: ./api
        image: heroes-api
        env_file: config.env
        volumes:
          - ./api:/usr/src/api
        ports:
          - 3000:3000
        links:
          - db:db
      db:
        build:
          context: ./db
        env_file: config.env
        image: heroes-db
        ports:
          - 5431:5432
      In a new config.env file, declare your environment variables for these Docker containers:
    POSTGRES_USER=heroesuser
    POSTGRES_PASSWORD=heroespassword
    POSTGRES_DB=heroesdb
    PGDATA=/data
    DB_HOST=db
    Build these services with the command: docker-compose build
CHECK 1: Your terminal should display these lines in succession, confirming the Docker images have been built:
Successfully tagged heroes-db:latest
Successfully tagged heroes-api:latest

Install nodemon and run our project (~5min)

    Add this to the project .gitignore: echo "node_modules" > .gitignore
    In the api folder, interactively create an api/package.json file: cd api && yarn init
    Add nodemon, babel-cli, babel-plugin-transform-class-properties, babel-preset-flow and babel-preset-es2015 to our dev dependencies: yarn add nodemon babel-cli babel-plugin-transform-class-properties babel-preset-es2015 babel-preset-flow -D
    In a new api/.babelrc file, write the babel configuration:
    {
      "presets": [
        "es2015",
        "flow"
      ],
      "plugins": [
        "transform-class-properties"
      ]
    }
    In our api/package.json, write the command to launch the server:
    "scripts": {
      "serve": "nodemon index.js --exec babel-node"
    }
    Create a new empty file api/index.js
    Go back to the root of the project: cd ..
    Run the project: docker-compose up
CHECK 1: Your terminal should display the logs of the two containers together, in two different colors
CHECK 2: From another terminal, you can access the API container and see the following folder structure: docker-compose exec api /bin/sh then, inside the container: ls -lath
drwxrwxr-x    3 node node  4.0K Aug 17 12:37 .
-rw-rw-r--    1 node node     0 Aug 17 12:37 index.js
drwxrwxr-x  222 node node 12.0K Aug 17 12:37 node_modules
-rw-rw-r--    1 node node   426 Aug 17 12:37 package.json
-rw-rw-r--    1 node node 66.2K Aug 17 12:37 yarn.lock
-rw-rw-r--    1 node node    86 Aug 17 12:32 Dockerfile
drwxr-xr-x    3 root root  4.0K Aug  3 11:50 ..
Exit with: CTRL-D
CHECK 3: You can access the db and display the PostgreSQL version: docker-compose exec db psql -U heroesuser -d heroesdb then, inside the container: select version();
PostgreSQL 9.6.3 on x86_64-pc-linux-gnu, compiled by gcc (Debian 4.9.2-10) 4.9.2, 64-bit
Exit with: CTRL-D

Create a koa server (~3min)

    Install koa and koa-router in our API: cd api && yarn add koa koa-router
    In the index.js file, create our server:
import Koa from 'koa';
import koaRouter from 'koa-router';

const app = new Koa();
const router = new koaRouter();

router.get('/', ctx => {
  ctx.body = 'Hello World!';
});

app.use(router.routes());
app.use(router.allowedMethods());
app.listen(3000);

console.log('Server is up and running');
CHECK 1: In the terminal running docker-compose, you should see Server is up and running
CHECK 2: Hitting localhost:3000 should return Hello World!: curl localhost:3000

Create a presentation layer with graphQL (~6min)

This layer lets our API know how to present data: what data can a user query? How should the front end query this data (fields, root queries, sub-queries...)?
    Install graphQL, graphQL Server Koa, graphQL tools and Koa body-parser: yarn add graphql graphql-server-koa graphql-tools koa-bodyparser
    In a new api/presentation folder, add a new schema.js file describing a simple GraphQL schema:
import { makeExecutableSchema } from 'graphql-tools';

const typeDefs = [`
  type Hero {
    id: Int!
    firstName: String
    lastName: String
  }

  type Query {
    heroes: [Hero]
  }

  schema {
    query: Query
  }
`];

const resolvers = {
  Query: {
    heroes: () => ([
      {
        id: 1,
        firstName: 'Clark',
        lastName: 'Kent',
      },
      {
        id: 2,
        firstName: 'Bruce',
        lastName: 'Wayne',
      },
    ]),
  },
};

const schema = makeExecutableSchema({ typeDefs, resolvers });

export default schema;
    In the api/index.js file, add our API endpoint:
import koaBody from 'koa-bodyparser';
import { graphqlKoa } from 'graphql-server-koa';
import schema from './presentation/schema';

...

router.post(
  '/api',
  graphqlKoa(async ctx => {
    return {
      schema: schema,
      context: {},
      debug: true,
    };
  })
);

...

// Write the following line before all other app.use(...) calls:
app.use(koaBody());
CHECK 1: In Postman, make a POST request to localhost:3000/api whose Content-Type is JSON (application/json), with the following raw body:
{
  "query": "{heroes { firstName lastName }}"
}
...should return our two heroes, Clark and Bruce.
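For reference, the JSON response should look roughly like this (standard GraphQL response shape; exact formatting may differ):
{
  "data": {
    "heroes": [
      { "firstName": "Clark", "lastName": "Kent" },
      { "firstName": "Bruce", "lastName": "Wayne" }
    ]
  }
}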
    Install Koa graphiQL: yarn add koa-graphiql
    In the index.js file, let our API know it should use koa-graphiql:
import graphiql from 'koa-graphiql';

...

// Write the following block after the other router.verb(...) calls:
router.get('/graphiql', graphiql(async (ctx) => ({
  url: '/api',
})));
CHECK 2: Hitting localhost:3000/graphiql should display the GraphiQL interface and show the Docs
CHECK 3: Using the GraphiQL interface with the following query:
{
  heroes {
    firstName
    lastName
  }
}
...should return our two heroes, Clark and Bruce.

Create a business layer (~5min)

This layer will contain all the business logic: access control, scoping / whitelisting, batching / caching, and computed properties. More explanations can be found here, in the bam-api repo. In this MO, we will only cover access control logic and batching / caching.
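To illustrate what a computed property could look like (a minimal sketch only, not used in the rest of this MO), the business class can expose values derived from the raw data it wraps:
// Hypothetical computed property on the business object:
// the value is derived from stored fields instead of being stored itself.
class Hero {
  // ... fields and constructor as defined in the next step ...

  get fullName(): string {
    return `${this.firstName} ${this.lastName}`;
  }
}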
    In a new api/business folder add a new hero.js file describing our class for this business object:
const mockedHeroes = [
  {
    id: 1,
    firstName: 'Clark',
    lastName: 'Kent',
  },
  {
    id: 2,
    firstName: 'Bruce',
    lastName: 'Wayne',
  },
];

class Hero {
  id: number;
  firstName: string;
  lastName: string;

  constructor(data) {
    this.id = data.id;
    this.firstName = data.firstName;
    this.lastName = data.lastName;
  }

  static async load(ctx, args) {
    const data = mockedHeroes[args.id];
    if (!data) return null;

    return new Hero(data);
  }

  static async loadAll(ctx, args) {
    const data = mockedHeroes;

    return data.map(row => new Hero(row));
  }
}

export default Hero;
    In our previous presentation/schema.js file, modify our mocked resolvers to use our business layer:
+import Hero from '../business/hero';

  type Query {
    heroes: [Hero]
+   hero(id: Int!): Hero
  }

const resolvers = {
  Query: {
-   heroes: () => ([
-     {
-       id: 1,
-       firstName: 'Clark',
-       lastName: 'Kent',
-     },
-     {
-       id: 2,
-       firstName: 'Bruce',
-       lastName: 'Wayne',
-     }
-   ]),
+   heroes: async (_, args, ctx) => Hero.loadAll(ctx, args),
+   hero: async (_, args, ctx) => Hero.load(ctx, args),
  },
}
CHECK 1: Using the GraphiQL interface with the following query:
{
  heroes {
    id
    firstName
    lastName
  }
}
...should return our two heroes, Clark and Bruce.
CHECK 2: Using the GraphiQL interface with the following query:
{
  hero(id:0) {
    id
    firstName
    lastName
  }
}
...should return Clark Kent with id: 1 (for now the mocked data is looked up by array index, not by id).
CHECK 3: Using the GraphiQL interface with the following query:
{
  hero(id:1) {
    id
    firstName
    lastName
  }
}
...should return Bruce Wayne with id: 2.

Seed our database (~8min)

    Install knex and pg at the root of the project: cd .. && yarn add knex pg
    At the root of our project, add a knexfile.js file (from your machine, the db is reached on localhost through the mapped 5431 port; inside Docker, the API will reach it via DB_HOST=db):
module.exports = {
  development: {
    client: 'pg',
    connection: {
      host: 'localhost',
      port: 5431,
      user: 'heroesuser',
      password: 'heroespassword',
      database: 'heroesdb',
    },
    migrations: {
      directory: './api/db/migrations',
    },
    seeds: {
      directory: './api/db/seeds',
    },
  },
};
    Create a migration file: yarn knex migrate:make add_heroes_table and complete the newly created file with this:
exports.up = function(knex, Promise) {
  return knex.schema.createTableIfNotExists('Heroes', function(table) {
    table.increments('id');
    table.string('firstName');
    table.string('lastName');
    table.string('heroName');
  });
};

exports.down = function(knex, Promise) {
  return knex.schema.dropTableIfExists('Heroes');
};
    Create a seed file: yarn knex seed:make heroes and complete the newly created file with this:
exports.seed = function(knex, Promise) {
  return knex('Heroes').del()
    .then(function () {
      return knex('Heroes').insert([
        {id: 1, firstName: 'Clark', lastName: 'Kent', heroName: 'Superman'},
        {id: 2, firstName: 'Bruce', lastName: 'Wayne', heroName: 'Batman'},
        {id: 3, firstName: 'Peter', lastName: 'Parker', heroName: 'Spiderman'},
        {id: 4, firstName: 'Susan', lastName: 'Storm-Richards', heroName: 'Invisible Woman'},
      ]);
    });
};
    Run the migration and the seed: yarn knex migrate:latest && yarn knex seed:run
CHECK 1: You can access the db and display the content of the Heroes table: docker-compose exec db psql -U heroesuser -d heroesdb then, inside the container: select * from "Heroes";
 id | firstName |    lastName    |    heroName
----+-----------+----------------+-----------------
  1 | Clark     | Kent           | Superman
  2 | Bruce     | Wayne          | Batman
  3 | Peter     | Parker         | Spiderman
  4 | Susan     | Storm-Richards | Invisible Woman
(4 rows)
Exit with: CTRL-D

Create a db layer with knex (~6min)

This layer lets our API query the data using the knex query builder.
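To give an idea of what this layer does, the builder calls written below translate into plain SQL against our Heroes table, roughly like this (approximate SQL; with debug: true, knex logs similar statements):
// Approximate SQL generated by the query-builder methods of this step:
db.first().table('Heroes').where('id', 1);
// -> select * from "Heroes" where "id" = 1 limit 1

db.select().table('Heroes').whereIn('id', [1, 2]);
// -> select * from "Heroes" where "id" in (1, 2)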
    Install knex and pg in our API: cd api && yarn add knex pg
    In the api/db folder add a new index.js file:
import knex from 'knex';

export default knex({
  client: 'pg',
  connection: {
    host: process.env.DB_HOST,
    user: process.env.POSTGRES_USER,
    password: process.env.POSTGRES_PASSWORD,
    database: process.env.POSTGRES_DB,
  },
  debug: true,
});
    In a new api/db/queryBuilders subfolder, create a new hero.js file and add these few methods to query our data:
// @flow
import db from '..';

class Hero {
  static async getById(id: number) {
    return db
      .first()
      .table('Heroes')
      .where('id', id);
  }

  static async getByIds(ids: Array<number>) {
    return db
      .select()
      .table('Heroes')
      .whereIn('id', ids);
  }

  static async getAll() {
    return db
      .select()
      .table('Heroes');
  }
}

export default Hero;
    Modify the api/business/hero.js file in our business layer this way:
-const mockedHeroes = [
-  {
-    id: 1,
-    firstName: 'Clark',
-    lastName: 'Kent',
-  },
-  {
-    id: 2,
-    firstName: 'Bruce',
-    lastName: 'Wayne',
-  },
-];
+import HeroDB from '../db/queryBuilders/hero';

class Hero {

  static async load(ctx, args) {
-   const data = mockedHeroes[args.id];
+   const data = await HeroDB.getById(args.id);
    if (!data) return null;

    return new Hero(data);
  }

  static async loadAll(ctx, args) {
-   const data = mockedHeroes;
+   const data = await HeroDB.getAll();

    return data.map(row => new Hero(row));
  }
CHECK 1: Using the GraphiQL interface with the following query:
{
  hero(id:1) {
    id
    firstName
    lastName
  }
}
...should return Clark Kent with id: 1.
CHECK 2: Using the GraphiQL interface with the following query:
{
  heroes {
    id
    firstName
    lastName
  }
}
...should return all 4 heroes of our database.

Add association to our API (~6min)

Associations are made both in our db and in our API, in the presentation layer.
    Create a new migration: cd .. && yarn knex migrate:make add_heroes_enemies
    Complete the newly created migration file with this:
exports.up = function(knex, Promise) {
  return knex.schema.table('Heroes', function(table) {
    table.integer('enemyId').references('id').inTable('Heroes');
  });
};

exports.down = function(knex, Promise) {
  return knex.schema.table('Heroes', function(table) {
    table.dropColumn('enemyId');
  });
};
    Modify our api/db/seeds/heroes.js seeds:
exports.seed = function(knex, Promise) {
  return knex('Heroes').del()
    .then(function () {
      return knex('Heroes').insert([
        {id: 1, firstName: 'Clark', lastName: 'Kent', heroName: 'Superman', enemyId: 2},
        {id: 2, firstName: 'Bruce', lastName: 'Wayne', heroName: 'Batman', enemyId: 1},
        {id: 3, firstName: 'Peter', lastName: 'Parker', heroName: 'Spiderman'},
        {id: 4, firstName: 'Susan', lastName: 'Storm-Richards', heroName: 'Invisible Woman'},
      ]);
    });
};
    Run the migration and the seed again: yarn knex migrate:latest && yarn knex seed:run
    In our business layer, modify api/business/hero.js this way:
class Hero {
  id: number;
  firstName: string;
  lastName: string;
+ heroName: string;
+ enemyId: number;

  constructor(data) {
    this.id = data.id;
    this.firstName = data.firstName;
    this.lastName = data.lastName;
+   this.heroName = data.heroName;
+   this.enemyId = data.enemyId;
  }
    In our API, in our presentation layer, modify our api/presentation/schema.js:
const typeDefs = [`
  type Hero {
    id: Int!
    firstName: String
    lastName: String
+   heroName: String
+   enemy: Hero
  }
  ...
`];

const resolvers = {
  Query: {
    ...
  },
+ Hero: {
+   enemy: async (hero, args, ctx) => Hero.load(ctx, {id: hero.enemyId}),
+ },
}
CHECK 1: Using the GraphiQL interface with the following query:
{
  hero(id:1) {
    id
    firstName
    lastName
    heroName
    enemy {
      heroName
    }
  }
}
...should return Clark Kent with its heroName and its enemy: Batman.

Push your API to the next level: use caching with Dataloader (~6min)

Trying to query heroes and their enemies' heroName reveals an N+1 problem: our API makes 5 round-trips to the database! Try it yourself:
{
  "query": "{heroes { id firstName lastName heroName enemy { heroName } }}"
}
We can reduce these calls by adding batching and caching to our business layer.
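Concretely, with debug: true the terminal logs roughly one query for the heroes list plus one query per hero for its enemy (approximate SQL, shown for illustration only):
select * from "Heroes"                          -- Hero.loadAll
select * from "Heroes" where "id" = 2 limit 1   -- enemy of Clark
select * from "Heroes" where "id" = 1 limit 1   -- enemy of Bruce
-- ...and one more query for each remaining hero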
    Install Dataloader: cd api && yarn add dataloader
    Add a getLoaders method to our api/business/hero.js file in our business layer:
import DataLoader from 'dataloader';

class Hero {
  //...

  static getLoaders() {
    const getById = new DataLoader(ids => HeroDB.getByIds(ids));
    const primeLoaders = (heroes) => {
      heroes.forEach(hero =>
        getById.clear(hero.id).prime(hero.id, hero)
      );
    };
    return { getById, primeLoaders };
  }
  //...
}
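Note: DataLoader expects its batch function to resolve with exactly one value per key, in the same order as the keys, and a SQL whereIn does not guarantee any ordering. With our small seed this usually goes unnoticed, but a safer batch function re-maps the rows onto the requested ids (a suggested hardening, not part of the original MO):
// Safer batch function: re-order the rows returned by getByIds so that
// values[i] corresponds to ids[i], as required by the DataLoader contract.
const getById = new DataLoader(async ids => {
  const rows = await HeroDB.getByIds(ids);
  return ids.map(id => rows.find(row => row.id === id) || null);
});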
    In our api/index.js file, add a new dataloader to our context for each query on the /api route:
+import Hero from './business/hero';

router.post(
  '/api',
  graphqlKoa(async ctx => {
    return {
      schema: schema,
-     context: {},
+     context: {
+       dataLoaders: {
+         hero: Hero.getLoaders(),
+       },
+     },
      debug: true,
    };
  })
);
    Back in our api/business/hero.js business layer file, modify load and loadAll methods to use our dataloader:
  static async load(ctx, args) {
-   const data = await HeroDB.getById(args.id);
+   const data = await ctx.dataLoaders.hero.getById.load(args.id);
    if (!data) return null;

    return new Hero(data);
  }

  static async loadAll(ctx, args) {
    const data = await HeroDB.getAll();
+   ctx.dataLoaders.hero.primeLoaders(data);

    return data.map(row => new Hero(row));
  }
    Protect the loader.load() call when no id is supplied (heroes without an enemy have a null enemyId, and DataLoader throws if load() is called without a key):
  static async load(ctx, args) {
+   if (!args.id) return null;
    const data = await ctx.dataLoaders.hero.getById.load(args.id);
    if (!data) return null;

    return new Hero(data);
  }
CHECK 1: Using the GraphiQL interface with the following query:
{
  heroes {
    id
    firstName
    lastName
    heroName
    enemy {
      heroName
    }
  }
}
...should return all heroes and their enemies, and your terminal should log only one request to the DB.
CHECK 2: Using the GraphiQL interface with the following query:
{
  h1: hero(id:1) {
    id
    firstName
    lastName
    heroName
    enemy {
      heroName
    }
  }
  h2: hero(id:2) {
    id
    firstName
    lastName
    heroName
    enemy {
      heroName
    }
  }
}
...should return Clark Kent and Bruce Wayne; and only one SELECT call should have been made to our DB.

Add access control to our API (~5min)

This is a very simple example; for a more advanced solution, prefer using Koa Jwt.
    In a new api/utils.js file, add these two functions to parse the Authorization header and verify the token:
export const parseAuthorizationHeader = (req) => {
  const header = req.headers.authorization;

  if (typeof header === 'undefined' || header === 'null') {
    return null;
  }

  const [, scheme, token] = (/(\w+) ([\w.-]+)/g).exec(header);

  return token;
};

// Not production-ready: this is a simple example for the tutorial
export const verifyToken = token => new Promise((resolve, reject) => {
  if (token !== 'authorized') {
    const error = new Error('UNAUTHORIZED');
    error.code = 401;
    reject(error);
  }
  resolve();
});
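With this parser, requests are expected to carry a header such as Authorization: Bearer authorized (any scheme word matches the regular expression above), and with the sample verifyToken only the exact token authorized passes. Once the next step is wired up, a request could be sent like this (hypothetical example):
curl -X POST localhost:3000/api \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer authorized" \
  -d '{ "query": "{heroes { firstName lastName }}" }'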
    In our api/index.js file, parse the Authorization header and pass the token to our context:
+import { parseAuthorizationHeader } from './utils';

router.post(
  '/api',
  graphqlKoa(async ctx => {
    return {
      schema: schema,
      context: {
+       authToken: parseAuthorizationHeader(ctx.req),
        dataLoaders: {
          hero: Hero.getLoaders(),
        }
      },
      debug: true,