Last night and today I fixed some of the packaging issues with the help of a few build utility scripts.
## node_modules for every Lambda function

I added a `package.json` for every Lambda function, so I would not have to copy the back-end's base `node_modules` folder every time I changed something.
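A minimal per-function `package.json` could look like this (the name and dependency are placeholders for illustration, not the actual project's):

```json
{
  "name": "getgamechannel",
  "version": "1.0.0",
  "private": true,
  "dependencies": {
    "pusher": "^2.2.0"
  }
}
```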
Then I use this script to run `npm i` in every Lambda function directory:
```javascript
const { readdirSync, statSync } = require("fs");
const { join, resolve } = require("path");
const { execSync } = require("child_process");

const functionsDir = resolve(__dirname, "../functions");

// List the sub-directories of a directory
const dirs = p =>
  readdirSync(p).filter(f => statSync(join(p, f)).isDirectory());

console.log("Running 'npm i' for Lambda functions:");

dirs(functionsDir)
  .map(d => {
    console.log(" - " + d);
    return d;
  })
  .map(d => join(functionsDir, d))
  // cwd instead of `cd ${d} &&` so paths with spaces don't break
  .forEach(d => execSync("npm i", { cwd: d }));
```
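To avoid running it by hand, the script could also be wired into the back-end's `package.json` (script name and path are assumptions for illustration):

```json
{
  "scripts": {
    "postinstall": "node scripts/install-functions.js"
  }
}
```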
## Shared code between the Lambda functions

I added a `shared` folder that gets copied into every Lambda function. Not very pretty, but it works for now.
```javascript
const { readdirSync, statSync } = require("fs");
const { join, resolve } = require("path");
const { ncp } = require("ncp");

const sharedDir = resolve(__dirname, "../shared");
const functionsDir = resolve(__dirname, "../functions");

// List the sub-directories of a directory
const dirs = p =>
  readdirSync(p).filter(f => statSync(join(p, f)).isDirectory());

console.log("Copy shared code to Lambda functions:");

dirs(functionsDir)
  .map(d => {
    console.log(" - " + d);
    return d;
  })
  .map(d => join(functionsDir, d, "shared"))
  .forEach(d => ncp(sharedDir, d));
```
## Kappa for async Lambda functions

At the moment I only share one helper, which I called kappa. It solves one of the problems I had: the basic Lambda function callback handling.
```javascript
const lambda = require("../index.js");

exports.handler = (event, context, callback) =>
  lambda(event, context)
    .then(r => {
      if (r && r.body) r.body = JSON.stringify(r.body);
      callback(null, r);
    })
    .catch(callback);
```
This function becomes the actual Lambda handler and lets you write your Lambda functions with async/await. It also stringifies the body of a response if one exists.
```javascript
exports.handler = (event, context, callback) => {
  callback(null, JSON.stringify({ result: event.body + "!" }));
};
```

now becomes

```javascript
module.exports = async (event, context) => ({
  result: event.body + "!"
});
```
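To see what the wrapper does end to end, here is a small simulation of kappa wrapping an async function, invoked the way Lambda would call the handler (the response shape is just an example):

```javascript
// The async "Lambda function" that kappa would require from ../index.js
const lambda = async (event, context) => ({
  statusCode: 200,
  body: { message: "hello " + event.name }
});

// The kappa wrapper from above, inlined
const handler = (event, context, callback) =>
  lambda(event, context)
    .then(r => {
      if (r && r.body) r.body = JSON.stringify(r.body);
      callback(null, r);
    })
    .catch(callback);

// Invoke it the way Lambda would
handler({ name: "world" }, {}, (err, res) => {
  // res.body is now the JSON string '{"message":"hello world"}'
  console.log(err, res);
});
```

A nice side effect of the `.catch(callback)` at the end: a rejected promise is passed straight to the callback as the error argument, so throwing inside the async function fails the invocation as expected.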
## Pusher Application State

Then, after all the maintenance was done, I tried to get some game/channel joining logic going on the server.

The idea was: when someone wants to join a game, they have to request a channel ID via GET `/getgamechannel`. The server will calculate one based on the open channels and how many players are inside of them.

This `/getgamechannel` endpoint, together with the `/pusherauth` endpoint, should later keep players from joining full or already running games.
I read that I could query the application state of my Pusher app on the server side, but somehow this doesn't work consistently.
```javascript
pusher.get({
  path: "/channels",
  params: {
    info: ["user_count"]
  }
});
```
My clients get presence updates from Pusher and I see every client on every client, but when I run this query on the back-end, I often get an empty list of channels or something.

Either the application state is eventually consistent (but then I don't know how long it takes?) or I have some error in my code, which wouldn't be too surprising :D
## Conclusion

Anyway, a fun project, and I already learned a lot about Pusher and AWS :)