• Development

Migrating 40k users from Hoodie / CouchDB to Graphcool

At Prototypo, we had an old backend system that wasn’t flexible, or should I say too flexible? So when I joined the team, I was in charge of rewriting the payment system with external AWS Lambda functions. I didn’t talk much about it, but I dropped some thoughts in the article about Lambda and API Gateway.

But as we kept adding features, we felt the need to refactor the rest. Hoodie is a great tech when it comes to offline-first capabilities, simple storage à la Firebase and real-time data, but we didn’t need all of this. It was used in a weird way that I immediately wanted to fix: everything was loaded on the first page load, and the client kept sending data every time there was a change, without anything ever coming down from the server. We also experienced lots of problems where people had their fonts completely reset to a previous state because of multiple active sessions at the same time.

So our needs were:

  • Link more data to start working on collaborating features
  • Real-time data coming down from the server (again, collaborating features)
  • BaaS (Backend as a Service) to avoid the maintenance pain
  • Extensibility (like webhooks) to support our different use cases

I was already using GraphQL for another project, and it seemed to me like the thing to use (no hype-driven development, it’s just easier to fetch the data we need). It appeared that, among all the existing solutions, graph.cool was the way to go. They have done a lot for the community, they open-source pretty much everything, they’re nice, and their solution has a lot to offer!

  • Link more data: it’s a GraphQL API (compatible with both Relay and custom solutions like Apollo)
  • Real-time: They support GraphQL Subscriptions
  • “Functions” are a way to transform or react to changes
  • Their project plan is free for open-source (we are \o/)
  • They also have a way to extend GraphQL mutations with functions

And now, let’s dig into that migration!

The database was, if I may, very poorly designed. Users are registered in a _users database that contains basic info like the email, the password and a link to the billing account. Every user is linked to a unique — or so I thought… — database storing their preferences and projects.

Transferring user accounts first

I first decided to transfer every user we had in the database, which was the easiest part since CouchDB has a REST endpoint that can dump everything, simply with /_users/_all_docs?include_docs=true. That being done, I had to push them onto Graphcool without bloating the network with 40k requests. That’s where GraphQL comes in handy: you can batch mutations to avoid sending multiple requests and send them all at once. I split my users into groups of 100 and sent them while checking that no errors were returned (see the sketch after the steps below).

To avoid duplicates, I first used the same batching ability to query every user and see which ones were missing. That way I could run the script multiple times to push the last users who registered before we shipped the new code.

  1. Fetching all users

    {
      user1: User(email: "user@something.com") {
        id
      }
    
      user2: User(email: "notyetregistered@something.com") {
        id
      }
    
      ...
    }
    
  2. Filtering the response:

    {
      "data": {
        "user1": { "id": "some_id_returned_by_graphcool" },
        "user2": null,
        ...
      }
    }
    
  3. Sending the new ones:

    mutation {
    
      user2: signUpEmail(
        email: "notyetregistered@something.com",
        password: "password"
        oldSignedUpAt: "${signedUpAt}"
        oldCreatedAt: "${createdAt}"
      ) {
        id
      }
    
      ...
    }
    

And that’s pretty much everything to get everyone transferred.
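
Put together, the batching from the steps above looked roughly like the following sketch. The endpoint constant, the chunk helper and the input list are hypothetical, and the oldSignedUpAt/oldCreatedAt fields from step 3 are omitted for brevity:

import fetch from 'node-fetch';

// hypothetical endpoint and input, for illustration only
const GRAPHCOOL_ENDPOINT = process.env.GRAPHCOOL_ENDPOINT;
const missingUsers = []; // the users filtered out in step 2

// split the users into groups of 100
const chunk = (array, size) =>
  array.reduce((chunks, item, index) => {
    if (index % size === 0) chunks.push([]);
    chunks[chunks.length - 1].push(item);
    return chunks;
  }, []);

// one aliased signUpEmail mutation per missing user
const buildMutation = users => `mutation {
  ${users
    .map((user, index) => `
      user${index}: signUpEmail(email: "${user.email}", password: "${user.password}") {
        id
      }`)
    .join('\n')}
}`;

const sendBatch = users =>
  fetch(GRAPHCOOL_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: buildMutation(users) }),
  })
    .then(r => r.json())
    .then(({ errors }) => {
      // stop the whole run if Graphcool reports an error for this batch
      if (errors) throw new Error(JSON.stringify(errors));
    });

// send the batches one after the other
chunk(missingUsers, 100).reduce(
  (previous, batch) => previous.then(() => sendBatch(batch)),
  Promise.resolve()
);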

Users’ fonts, profile and preferences

The biggest challenge was to sync users’ fonts because of the one-database-per-user thing. So if you take a user who has these projects:

  • My First Font
    • Regular
    • Bold
    • Custom variant
  • My Second Font
    • Regular

His database looked like this:

newappvalues/default
newaccountvalues/default

newfontvalues/myfirstfontregular
newfontvalues/myfirstfontbold
newfontvalues/myfirstfontcustomvariant
newfontvalues/mysecondfontregular

The first two documents contain the preferences and the profile values, but we’ll get to those later.

Each variant has its own document stored under family_name + variant_name, somewhat sanitized. That caused a nasty problem I encountered while transferring accounts. What if I’m Japanese and I want to name my font 青空 — because Blue Sky is such a great name — well, you end up with newfontvalues/regular, and worse, if you name your variant the same way, with newfontvalues/. An empty document name, that’s pretty bad… So I hope our non-latin community can forgive us for this; it should be totally fine from now on!
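
To make the problem concrete, a naive sanitizer along these lines (hypothetical, but close in spirit to what we had) simply drops every non-latin character:

// a naive document-name sanitizer: lowercase, then strip everything
// that is not a latin letter or a digit
const sanitize = name => name.toLowerCase().replace(/[^a-z0-9]/g, '');

sanitize('My First Font'); // 'myfirstfont'
sanitize('青空');           // ''  -> the document becomes 'newfontvalues/'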

Font families are stored directly on the user preferences. Every font is just a plain object that has its own variants list pointing to their databases, roughly like this:

{
  "library": [
    {
      "name": "My First Font",
      "template": "venus.ptf",
      "variants": [
        { "name": "Regular", "db": "myfirstfontregular" },
        { "name": "Bold", "db": "myfirstfontbold" },
      ]
    },
    {
      "name": "My Second Font",
      "template": "elzevir.ptf",
      "variants": [
        { "name": "Regular", "db": "mysecondfontregular" }
      ]
    }
  ],
  ...
}

But, as far as I know, we can only get users’ databases by querying them one by one… GET /__user_database__/_all_docs?include_docs=true.

This time, the script went a bit further: I needed to fetch all users, remove the non-existing common documents on both ends (Graphcool and CouchDB) and query all the data. I made a small cache system to avoid redownloading the whole thing in case of failure. That being done, I could fetch a hundred users’ databases and send the mutations I needed in one go.
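
The cache itself was nothing fancy; a sketch along these lines (the cache folder and helper name are made up) is enough to skip databases that were already downloaded:

import fs from 'fs';
import path from 'path';

const CACHE_DIR = path.resolve('cache');
if (!fs.existsSync(CACHE_DIR)) fs.mkdirSync(CACHE_DIR);

// wrap any "fetch this user database" function with a JSON file cache,
// so a crash halfway through doesn't force a full redownload
const withCache = fetchDatabase => databaseName => {
  const cacheFile = path.join(CACHE_DIR, `${databaseName}.json`);

  if (fs.existsSync(cacheFile)) {
    return Promise.resolve(JSON.parse(fs.readFileSync(cacheFile, 'utf8')));
  }

  return fetchDatabase(databaseName).then(data => {
    fs.writeFileSync(cacheFile, JSON.stringify(data));
    return data;
  });
};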

I used glouton, a small utility I made before for another use case. It allows you to retry failing requests and define a concurrency limit if you don’t want to send too many requests at the same time. For this migration, the configuration was pretty straightforward:

import fetch from 'node-fetch';
import glouton from 'glouton';

const fetchWithRetry = glouton(fetch, {
  concurrency: 1,

  validateResponse: r => {

    // We continue (see below why)
    if (r.status === 404) {
      return true;
    }

    // if request has failed, we'll try again later
    // you can also return a time (ms)
    // pretty useful for API limits
    if (!r.ok) {
      return 0;
    }

    // everything is ok, we shall proceed
    return true;
  },
});

...

fetchWithRetry(...)
  .then(r => {
    // When the resource is not found,
    // we can return a default value or something else
    // to avoid breaking everything
    if (r.status === 404) {
      return { rows: [] };
    }

    return r.json();
  })
  .then(data => {
    // do what you need
  })

Having everything I needed, I just had to start creating my mutations. The rough part was knowing which variant belongs to which font — assuming people could and would rename them — once they were already transferred. Basically, I saved user_database-document_name into an oldId field, which allowed me to gather them into a family array.

  1. Gather every variant with its family info.
  2. Look for the ones that were already migrated
  3. Prepare the variant attributes
    const variantProperties = font.map(variant => `
      name: "${variant.name}",
      oldId: "${user.databaseName}-${variant.documentName}",
      values: "${JSON.stringify(variant.values)}",
    `)
    
  4. Update families and create the missing variants
    gql`
      updateFamily(id: "${familyId}", name: "${familyName}", template: "${familyTemplate}") {
        id
      }
      
      createVariant(
        ${variantProperties}
        familyId: "${familyId}"
      ) {
        id
      }
    `
    
  5. Create the missing families with their variants using nested mutations
    gql`
      createFamily(
        name: "${familyName}",
        template: "${familyTemplate}",
        ownerId: "${user.id}"
        variants: [
          ${
            variantProperties
              .map(properties => `{ ${properties} }`)
              .join(',')
          }
        ]
      ) {
        id
      }
    `
    
  6. Update families and variants if they have been modified

When everything is done, you concatenate all the mutations, send them to the server in one go, and keep going through the users (see the sketch below). I won’t detail how I transferred the profile info and preferences, as it is pretty much the same thing but easier.
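
A minimal sketch of that last step, assuming each fragment built above contains a single mutation field and reusing the fetchWithRetry helper from earlier (the endpoint constant is hypothetical):

// every entry of `mutationParts` is one of the fragments built above,
// e.g. `createFamily(...) { id }`; aliasing them avoids name clashes
// when the same mutation field appears several times in the batch
const mutationParts = []; // fragments from steps 3 to 5
const GRAPHCOOL_ENDPOINT = process.env.GRAPHCOOL_ENDPOINT; // hypothetical

const batchedMutation = `mutation {
  ${mutationParts.map((part, index) => `op${index}: ${part}`).join('\n')}
}`;

fetchWithRetry(GRAPHCOOL_ENDPOINT, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: batchedMutation }),
}).then(r => r.json());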

And that, kids, is how I transferred 40,000 users from one place to another!

  • Anime
  • Non-TV program

Nichijō - 日常

Nichijo Cover

Hey, it’s been a while! Today, I want to talk about anime again. Besides the fact that I finally watched Attack on Titan (and you should too), I heard about Nichijō. It sounded like a reference or something, so I jumped into it. And what a weird encounter it was.

The plot… I guess there is no plot. You just take a bunch of people, mainly high school girls somewhat connected to each other, and watch them live. The episodes are composed of many short independent scenes, though they sometimes reference earlier ones. And that’s pretty much all I can say about it.

If you happen to love absurd scenes like I do, I advise you to watch it right now, you won’t be disappointed. And just for you, here’s a glimpse at my favorite scenes (don’t worry, you can’t really be spoiled).

Note
I think 通 (tooru) means transparent, that’s why she overreacted when hearing the word. But I have no clue for the small/short dilemma.

As a bonus, she gets back there to trick her friend a few episodes later.

And a last very funny one:

If you want more, you could check out the mosquitoes scene, or simply watch everything. ;)

  • Music

5 French Pop / Punk bands you might wanna know about

I don’t really know how the French music scene is seen abroad, and some people might want to discover interesting local bands. As I am (really, really) into punk-related music, this is the only thing I’ll share with you for now, but there are plenty of things to discover everywhere!

Guerilla Poubelle

Punk from Paris. Started in 2003.
This band likes to criticize the society we live in, which is why I consider them pure punk (that and the loud guitars and rough voice). They sing mostly in French, but they have released a few songs in English. I think they’re one of the best French punk references (they have toured all around the world).
They’re part of the Guerilla Asso, which gathers lots of punk artists (even from abroad).

Les 3 Fromages

Funny pop punk from Quiberon, Brittany. Started in 2006.
Originating from Brittany, where I come from too, these guys are cool! Literally “The 3 Cheeses” (referencing a pizza name, in case you didn’t get it :P), they play funny (and sometimes dirty) pop punk inspired by Green Day, The Offspring and Blink-182. Their best-known song is a prayer to a cheese-based French dish called “Tartiflette”.

Maladroit

Punk from Paris.
This band is one of the many side projects of Guerilla Poubelle’s members (you can easily recognize one of the singers’ voices). Completely pop punk, their lyrics are funny and the songs catchy. I like them for that and for the crazy music videos!

Sons Of O’Flaherty

Celtic punk from Vannes.
Featuring one of Les 3 Fromages’s members (yeah, I know, they all play in each other’s bands), this band makes interesting music, as they use traditional instruments like the tin whistle or even bagpipes. Inspired by bands like Dropkick Murphys or The Pogues, they mix Breton music, folk songs and Irish classics.

Poésie Zéro

Punk from outer space.
I wanted to include this band because they have a really funny, weird way of handling their communication. They’re always pissed off about everything and shout at the audience that they’re idiots. Sometimes you think they’re trying to educate you about society, but sometimes they’re just yelling at you. It still makes me laugh.

You might also want to check out:

  • Development

Setting up a React Native & Web project

Hello guys, it’s been a while! I wanted to talk about React Native today. I recently started a small project with a friend, and we wanted to be able to develop the mobile app as well as the website. For now, the project is at a really early stage, so I won’t say much about it, but I can share some design considerations for those of you who are hesitating. First things first, we needed something easy to manage our data, so we chose Firebase, which is a pretty good way of getting something done quickly.

React Native part

To get started, we install the CLI and generate a React Native project, which is pretty straightforward, just do:

npm install -g react-native-cli && react-native init MyProject

This will create a basic structure that looks like this:

MyProject
├── node_modules
├── android
├── ios
├── // bunch of . files
├── index.android.js
├── index.ios.js
└── package.json

This way, you can already start developing with react-native run-ios or react-native run-android. But now we need our web part, so how can we do that?

Here comes the web

Over the last few months, you may have heard about create-react-app, an awesome zero-configuration tool to start developing with React on the web. I chose it to get started because it’s really easy and fast. This tool is based on a package called react-scripts that manages the build configuration for you. And since we already have a structure for our project, we just need to integrate it, which is easy to do.
First, install it into the project by doing:

npm install react-scripts --save-dev

Then, all you need to create is:

  • An entry point for your web project located at src/index.js (no, I’m not wrong about the react-native import, wait for it :P)
    import React from 'react'
    import { AppRegistry, Text } from 'react-native'
    
    // the registered name must match the one passed to runApplication
    const App = () => <Text>Hello World!</Text>
    
    AppRegistry.registerComponent('App', () => App)
    AppRegistry.runApplication('App', { rootTag: document.getElementById('app') })
    
  • A public folder with a basic HTML: public/index.html
    <!doctype html>
    <html lang="en">
      <head>
        <title>MyProject</title>
        <meta charset="utf-8">
        <meta name="description" content="MyProject's description">
        <meta name="author" content="Me">
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <meta name="theme-color" content="#000000">
        <link rel="shortcut icon" href="%PUBLIC_URL%/favicon.ico">
      </head>
      <body>
        <div id="app"></div>
      </body>
    </html>
    
  • Add these lines to your package.json
    {
      "scripts": {
        "start": "react-scripts start",
        "build": "react-scripts build",
        "eject": "react-scripts eject"
      }
    }
    

And that’s it: we did the job of create-react-app, but manually, so that it integrates into our existing project.

Now, you might wonder: why is there an import from react-native in my web application? And you’re right, it is weird! The reason is that react-scripts integrates the react-native-web package, which allows you to use react-native components on the web. Every call to React Native is proxied to React Native Web when the web build runs. You don’t need to think about it, but keep it in mind because it allows you to share more code between your different platforms. You can still use ReactDOM.render if you prefer.
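
For instance, a component written only with react-native primitives (a hypothetical Title component, just as a sketch) renders on both targets without any change:

// src/components/Title/index.js
// shared between native and web: react-scripts transparently
// aliases 'react-native' to 'react-native-web' for the web build
import React from 'react';
import { Text, StyleSheet } from 'react-native';

const styles = StyleSheet.create({
  title: { fontSize: 24, fontWeight: 'bold' },
});

const Title = ({ children }) => <Text style={styles.title}>{children}</Text>;

export default Title;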

Last but not least, let’s look at the components and the structure we’re going to write!

Project final structure

There’s a cool thing you can do with React Native when importing files: it automatically resolves the right file for the platform it is building, based on the file’s name. Here are some examples:

// will import ./index.js on the web
//
// will import ./index.native.js on React Native build
// - OR -
// will import ./index.ios.js on iOS build
// will import ./index.android.js on Android build
import './index'

// And it works with folders too!!
// will import ./src/components/MyComponent/index.js on the web
// will import ./src/components/MyComponent/index.native.js on React Native
// etc...
import MyComponent from './src/components/MyComponent'

This feature is so nice and allows you to have a great project structure like this:

MyProject
├── node_modules
├── android
├── ios
├── src
│   ├── components
│   │   ├── ...
│   │   └── MyComponent
│   │       ├── index.js
│   │       ├── index.ios.js
│   │       └── index.android.js
│   ├── containers
│   │   └── MyContainer
│   │       └── index.js
│   └── index.js
├── // bunch of . files
├── index.android.js
├── index.ios.js
└── package.json

Containers like MyContainer hold the common logic and import components from the components folder; there’s no need to differentiate platforms inside the container, as shown below.
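
Here’s what such a container could look like (the MyContainer and MyComponent names come from the tree above; the title prop is made up):

// src/containers/MyContainer/index.js
// common logic, shared by every platform
import React from 'react';

// resolves to index.js, index.ios.js or index.android.js in
// src/components/MyComponent depending on the platform being built
import MyComponent from '../../components/MyComponent';

// the title prop is just an example
const MyContainer = () => <MyComponent title="Hello from the container" />;

export default MyContainer;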

Bonus: React Router (v4)

You can easily use React Router with React Native to manage the routes inside your application the same way you do in the web app. If you have a <Root /> component that uses a <Router /> to manage its routes, you can import either the MemoryRouter or the BrowserRouter according to the platform you target.

// src/index.js
import { Match, BrowserRouter as Router } from 'react-router'

// index.native.js - OR - index.android.js and index.ios.js
import { Match, MemoryRouter as Router } from 'react-router'

Hope this article gives some tips to people who aren’t aware of these nice features. See you next time! :D

  • Development

REST API made easy with Apex, AWS Lambda and AWS API Gateway

Update, September 4th: added an example of how to configure CORS for your lambdas.

Recently, I was asked to rewrite the backend of the company I’m currently working for with AWS services and Apex, a utility that facilitates the deployment of AWS Lambda functions. In short, AWS Lambda is a FaaS (Functions as a Service): you write simple functions that take an input and return an output, that’s it, nothing else. Similar services exist, like hook.io or Google Cloud Functions. As for AWS API Gateway, it is a way of mapping your Lambda functions to endpoints; it can handle content type matching, security, and other things you often repeat in your code. So let’s dive in!

An AWS Lambda function can be represented like this:

function(event, context, callback) {}

You can use either the context methods succeed and fail, or the callback function, whose first argument is the error and the second is the response. We can then dive into the Apex world, which makes everything Lambda-related easier.
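
As a quick sketch, a bare-bones handler using the callback style could look like this (the name property is made up):

// a minimal Lambda handler: take an input, return an output
exports.handler = function (event, context, callback) {
  if (!event.name) {
    // the first argument of the callback is the error
    return callback(new Error('A name is required'));
  }

  // the second argument is the response
  callback(null, { message: `Hello ${event.name}` });
  // the older style would be context.succeed(...) / context.fail(...)
};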

Apex

Start by downloading and installing the Apex CLI from apex.run; the website is very useful and well explained.

  1. You get the software
  2. You run apex init in a new project folder
  3. You get a structure like this:
    project.json
    functions
    ├── bar
    │   ├── function.json
    │   └── index.js
    └── foo
        ├── function.json
        └── index.js
    
  4. You deploy your functions with apex deploy!

And that’s it, boom, done, you have functions in the cloud! To use and test them, you just have to call them with apex invoke <function name>.

Bonus for Node.js users

You can npm install apex.js, a nice package that lets you express your functions using Promises and forget about try { ... } catch { ... } blocks.

From this:

export default ({ name }, ctx, cb) => {
  try {
    // doing Promise stuff and throwing errors
    cb(null, `Hello ${name}`);
    // or ctx.succeed(`Hello ${name}`)
  } catch(err) {
    cb(err);
    // or ctx.fail(err)
  }
}

You get this:

import λ from 'apex.js'

export default λ(({ name }) => {
  // doing Promise stuff and throwing errors
  return `Hello ${name}`
})

API Gateway

Definitions and deployment

Now let’s talk about API Gateway, because it’s not (yet?) integrated into Apex, so… it’s pretty hard to set up with code you can auto-deploy. But, digging into the Apex issues, you can find an odd Python script that lets you express a Swagger definition along with your functions. Let’s look at an example:

// function.json
{
  "description": "Say hello to a given name",
  "x-api-gateway": {
    "method": "get",
    "path": "/hello/{name}",
    "parameters":[{
      "name": "name",
      "in": "path",
      "description": "Name of the person we want to say hello",
      "required": true,
      "type": "string"
    }]
  }
}

This file is originally used by Apex to configure your lambda function precisely. But you can add more to it, and that’s why we put the Swagger definition here; it’s a very convenient way of doing things. So you get this x-api-gateway key, which is a Swagger extension AWS uses to add parameters to the API. Currently, the Python script is not really flexible about everything you can do with the AWS Swagger extensions; you can check the AWS docs to see every way of extending your Swagger interface.

Now for the main and most complex changes you need in your project.json:

"x-api-gateway": {
  "base_path": "/api",
  "stage_name":"dev",
  "rest-api-id":"<rest-api-id>",
  "swagger-func-template": {
    "consumes": ["application/json"],
    "produces": ["application/json"],
    "responses": { /* mapping HTTP codes to schemas */ },
    "x-amazon-apigateway-integration": {
      "responses": { /* mapping responses to HTTP codes */ },
      "requestTemplates": { /* See below, this one needs a loooong explanation */ },
      "uri": "arn:aws:apigateway:<region>:lambda:path/2015-03-31/functions/arn:aws:lambda:<region>:<account_id>:function:{{functionName}}/invocations",
      "credentials":"arn:aws:iam::<account_id>:role/APIGatewayLambdaInvokeRole",
      "passthroughBehavior": "when_no_match", /* This thing is important, I spent a lot of time because of it */
      "httpMethod": "{{functionMethod}}",
      "type": "aws"
    },
    "x-amazon-apigateway-auth" : { /* Everything security related */ }
  }
}

I’ve shortened everything, but there is a link to a boilerplate at the end of the article that contains a more complete example file. First things first, you need to create a REST API on API Gateway (tip: you can use the AWS CLI like this: aws apigateway create-rest-api --name 'My Awesome API'); this will give you an id you need to put in your configuration file.
swagger-func-template is a kind of global configuration applied to every function. For more information on how to define responses and such, you can check the AWS docs. Let me explain in a list what’s interesting here:

  • uri: this property needs to be filled with the ARN(s) Amazon gives you, but no worries, you can fill it in by hand. The {{functionName}} parameter will automatically be replaced with the function name, so leave it as is.

  • credentials: I didn’t search that much for this one, but replace account_id and it should work. :)

  • requestTemplates: This is a template mixing VTL (Velocity Template Language) and JSONPath, used to transform the shape of the input data before passing it to your lambda. Currently my configuration looks like this:

    "application/json": "{\n   \"method\": \"$context.httpMethod\",\n   \"body\" : $input.json('$'),\n   \"headers\": {\n     #foreach($param in $input.params().header.keySet())\n     \"$param\": \"$util.escapeJavaScript($input.params().header.get($param))\" #if($foreach.hasNext),#end\n \n     #end\n   },\n   \"queryParams\": {\n     #foreach($param in $input.params().querystring.keySet())\n     \"$param\": \"$util.escapeJavaScript($input.params().querystring.get($param))\" #if($foreach.hasNext),#end\n \n     #end\n   },\n   \"pathParams\": {\n     #foreach($param in $input.params().path.keySet())\n     \"$param\": \"$util.escapeJavaScript($input.params().path.get($param))\" #if($foreach.hasNext),#end\n \n     #end\n   }\n}"
    

    Okay okay, I know this doesn’t look good but what if I do this:

    {
      "method": "$context.httpMethod", // GET, POST, PUT...
      "body" : $input.json('$'), // your payload
      "headers": { // Content-Type and stuff like that
        #foreach($param in $input.params().header.keySet())
          "$param": "$util.escapeJavaScript($input.params().header.get($param))" #if($foreach.hasNext),#end
        #end
      },
      "queryParams": { // for example /sweets?sort=asc will gives you a sort property into queryParams
        #foreach($param in $input.params().querystring.keySet())
          "$param": "$util.escapeJavaScript($input.params().querystring.get($param))" #if($foreach.hasNext),#end
        #end
      },
      "pathParams": { // for example /sweets/{name} will gives you a name property into pathParams
        #foreach($param in $input.params().path.keySet())
          "$param": "$util.escapeJavaScript($input.params().path.get($param))" #if($foreach.hasNext),#end
        #end
      }
    }
    

    Better, huh? We can see it’s a kind of enhanced JSON with $ variables representing the input. What we are doing here is mapping the method, body, headers, query parameters and path parameters to their own properties in an object that will be sent to our lambda. You can even add hardcoded properties if you need to. This structure is really opinionated: on one hand, it’s really convenient because everything is well separated, but on the other hand, your lambda needs to know where the parameters it needs live (a handler consuming this shape is sketched after this list). It’s your choice to define it like this or to put every property on the main object. You could also write it differently in every function.json.

  • passthroughBehavior: This one is very important, it represents the way Amazon will call your lambda using the requestTemplates.

    • when_no_match: Mapping body with requestTemplates and if no content type is matched, content passes through as-is.
    • when_no_template: same as when_no_match when templates are defined, but if no templates, it passes through as-is.
    • never: rejects the method request if the content-type doesn’t match anything in mapping template.

    This option drove me crazy for an hour: my request was mapped only when I was not sending any body (it was set to the never option).
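
For instance, a lambda receiving the event shaped by the template above only has to destructure it. Here's a sketch, combining it with the apex.js wrapper shown earlier (the greeting query parameter is made up):

import λ from 'apex.js';

// the event is exactly the object built by the request template:
// { method, body, headers, queryParams, pathParams }
export default λ(({ method, pathParams, queryParams }) => {
  if (method !== 'GET') {
    throw new Error(`Unsupported method: ${method}`);
  }

  // e.g. GET /hello/{name}?greeting=Hi
  const greeting = queryParams.greeting || 'Hello';
  return `${greeting} ${pathParams.name}`;
});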

Now that everything is settled, you can deploy using the script:
python api-gateway-deployer/src/__init__.py project.json

Apex API Gateway

Python may not be what your co-workers want to install for a tiny, simple script… That’s why I rewrote and enhanced it in Node.js: less code, no Python required, and more flexibility to come if you or other people get interested.

A simple npm install -g apex-api-gateway and here you go. You can also install it locally in your project and call it from an npm script.

So you might first want to create an API:

apex-api-gateway create 'My Awesome API'

This will add a rest-api-id field to your project.json that can be used later to update your Swagger schema. This way, you don’t need to repeat yourself with the AWS CLI. And speaking of updating, here’s how:

apex-api-gateway update

Now you can develop and deploy without bothering with the AWS user interface; you only need, of course, the appropriate security roles.

Adding CORS

Warning: from now on, if you follow these instructions, the configuration won’t work with the Python script, since apex-api-gateway has more advanced features that are required here.

I struggled a lot to add CORS to my lambdas, so here’s how to deal with it for anyone else who is stuck. First, you need to reply to preflight requests, the OPTIONS requests that browsers send before the real request to validate access to a resource. So, let’s add what you need to the project.json:

{
  /* ... */
  "x-api-gateway": {
    "paths": {
      ".+": {
        "options": {
          "summary": "CORS support",
          "description": "Enable CORS by returning correct headers\n",
          "consumes": ["application/json"],
          "produces": ["application/json"],
          "tags": ["CORS"],
          "x-amazon-apigateway-integration": {
            "type": "mock",
            "requestTemplates": { "application/json": "{\n \"statusCode\" : 200\n}\n" },
            "responses": {
              "default": {
                "statusCode": "200",
                "responseParameters": {
                  "method.response.header.Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key'",
                  "method.response.header.Access-Control-Allow-Methods": "'GET, PUT, POST, DELETE'",
                  "method.response.header.Access-Control-Allow-Origin": "'*'"
                },
                "responseTemplates": { "application/json": "{}\n" }
              }
            }
          },
          "responses": {
            "200": {
              "description": "Default response for CORS method",
              "headers": {
                "Access-Control-Allow-Headers": { "type": "string" },
                "Access-Control-Allow-Methods": { "type": "string" },
                "Access-Control-Allow-Origin": { "type": "string" }
              }
            }
          }
        }
      }
    },
    /* ... */
  }
  /* ... */
}

The paths key was added in version 0.2.0 of apex-api-gateway to set default methods on every resource matched by a regex. You can find more about it in the paths section of the README. This way, we add an OPTIONS method to every resource we want to be available in the browser. The regex is set to match everything, but you could allow only one part of your API; it’s very flexible this way.

Now that we have defined our preflight requests, we have to add the same headers to all the responses we want. For now, we do it in swagger-func-template; the behavior of this key is very similar to paths, except you don’t match paths and methods. In the future, it will likely be merged into the paths property with a similar behavior for methods.

"swagger-func-template": {
  /* ... */
  "responses": {
    "200": {
      /* ... */
      "headers": {
        "Access-Control-Allow-Headers": { "type": "string" },
        "Access-Control-Allow-Methods": { "type": "string" },
        "Access-Control-Allow-Origin": { "type": "string" }
      }
    },
    /* ... */
  },
  "x-amazon-apigateway-integration": {
    "responses": {
      "default": {
        "statusCode": "200",
        "responseParameters": {
          "method.response.header.Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key'",
          "method.response.header.Access-Control-Allow-Methods": "'GET, PUT, POST, DELETE'",
          "method.response.header.Access-Control-Allow-Origin": "'*'"
        }
      },
      /* ... */
    },
    /* ... */
  },
  /* ... */
}

And that’s it; you can still override the configuration per method if you need to.

Conclusion

Thanks for reading! You can check out my Apex API Gateway boilerplate on GitHub (Node.js); it has (almost) everything you need to get started! Tell me what you think about the apex-api-gateway script, and don’t hesitate to file issues or contribute on the GitHub repository.

PS: a friend told me about ClaudiaJS. I have never tried it or even seen it before, but maybe it is a good alternative if you want to write your Lambda functions the Express.js way.

  • Me

The Renewal

Hello everyone, this blog is coming back! This doesn’t mean it is completely functional right now, but previous posts are coming in the next couple of weeks!

Yeah!

Inserting React components inside your Jekyll markdown posts

Thoughts about relationship deletion in Redux with Normalizr

Forms without refs in React stateless components

Echofon for Firefox : The renewal

Pretty printing all JSON output in Silex PHP

Symfony2 : Protéger ses entités avec les voters

Global Game Jam 2015 - Que ferait MacGyver ?

SuperTurkey - Thanksgiving

Ménage de prin... automne

Informations ISEN CIR3 - L'alternance en Cycle Informatique & Réseaux

Angel Beats! - エンジェルビーツ!

The Internet's Own Boy : The Story of Aaron Swartz

Glass Camp Bank à Brest

Brezeliad - Première séquence gameplay

Le Trailer de Brezeliad ou Le Parcours du Combattant

Une patate volante sur Twitter

Stunfest J-27 - L'aventure continue

L'offre légale chez EA

Stunfest - J-100

Marty - Back To The Future

Steins;Gate - シュタインズ ゲート

Enfin un nom de domaine !

OnHack Reborn !

Une deuxième année bien chargée

Raspberry Pi - Un ordinateur low-cost au goût de framboise

Echofon for Firefox (Twitterfox) is not dead (et espérons qu'elle le restera)

Groar ou l'art de ne jamais finir ses projets

Extensions Gnome-Shell - Requêtes HTTP avec libsoup

Gérer les changements de maps avec MelonJS

Les Mystères de Rennes

Global Game Jam 2013 - Retour d'expérience

Les Enfants Loups, Ame et Yuki - おおかみこどもの雨と雪

Puella Magi Madoka Magica - 魔法少女まどか☆マギカ

Accel World - アクセル・ワールド

Saleté de système de traduction !

Et encore un blog !