I made an unfinished VR demo for the js13k Games 2025 competition, a game jam where you have to make a web game that fits under 13 KB. I aimed for the WebXR category, where you get to use a provided library/engine without it counting for the size limit. I picked PlayCanvas this time, and I want to share my findings here.
About the game
The theme of the competition was “Black Cat”. js13kGames has a character, Badlucky, who is a black cat. I’ve seen videos of cats chasing laser pointer lights. So I decided to make a VR game where you have Badlucky chase around and jump at laser pointer lights coming from your VR controllers. There’s no goal at this point in time, but the laser pointers and cat chasing behavior are working.
I would post a short gif or video of the game here, but I couldn’t figure out screen mirroring (more on that later).
You can play the game yourself here – https://js13kgames.com/2025/games/laser-pointer-cat – but you need a VR headset to play it with the intended controls. There are desktop controls (WASD + arrow keys, mouse click), but they’re very rough.
Setup with TypeScript
In the WebXR category, whichever engine you choose (A-Frame, Babylon.js, Three.js, PlayCanvas), you cannot load external assets or code beyond what's already included. PlayCanvas, if you're familiar with it, has a web editor, but the core engine itself is open source.
Submissions have to use the PlayCanvas engine build hosted by js13kGames themselves, so your final build can link to that hosted engine instead of bundling it. I wanted to use TypeScript, so I forked this template by Matt McKenna and edited it to include PlayCanvas with all the typings working. You can find my fork here – https://github.com/svntax/js13k-webpack-typescript-playcanvas-starter
The project at this point is already at around 1.2 KB, so there’s plenty of space left for an actual game. But from this point on is where I encountered some obstacles with PlayCanvas.
No physics engine
PlayCanvas’s core engine does not include ammo.js, the physics engine you’d normally see in their web editor. If you check out the ammo.js builds, even the minified files are easily over 400 KB. So what does this mean?
No rigidbody system. Anything that uses the RigidBody and Collision components for physics (things like applying forces, impulses, friction, damping, etc.) will not run.
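Without ammo.js, any physics you need has to be hand-rolled. Here's a minimal sketch of what that could look like for something like the cat's jumps: plain vertical integration with a ground plane, no PlayCanvas types involved. The names (`Body`, `stepBody`, `jump`) and all the numbers are made up for illustration, not from my game.

```typescript
// Minimal hand-rolled physics: gravity integration plus a ground plane.
// All values (gravity, ground height, impulse) are illustrative.
const GRAVITY = -9.8;
const GROUND_Y = 0;

interface Body {
    y: number;       // vertical position in meters
    vy: number;      // vertical velocity in m/s
    onGround: boolean;
}

// Advance one body by dt seconds: apply gravity, then clamp to the ground.
function stepBody(body: Body, dt: number): void {
    body.vy += GRAVITY * dt;
    body.y += body.vy * dt;
    if (body.y <= GROUND_Y) {
        body.y = GROUND_Y;
        body.vy = 0;
        body.onGround = true;
    } else {
        body.onGround = false;
    }
}

// Launch a jump only when standing on the ground.
function jump(body: Body, impulse: number): void {
    if (body.onGround) {
        body.vy = impulse;
        body.onGround = false;
    }
}
```

In practice you'd call something like `stepBody` every frame from PlayCanvas's `app.on("update", ...)` callback and copy the result into the entity's position.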
No default font loading

Fonts are assets in PlayCanvas. At least, that's the typical way of working with fonts. In the web editor, you can just drag-and-drop a .ttf file, and it'll create the font asset for you. With the standalone core engine, though, you have to import these font assets yourself. A font asset is two files – a .json file and a .png file – which together easily blow past the 13 KB limit. See for yourself in PlayCanvas's example projects – https://github.com/playcanvas/engine/tree/main/examples/assets/fonts
Instead, I ended up using CanvasFont. I couldn't find anything about this class in either the official docs or the API reference, but there's a sample project about emojis that uses CanvasFont here – https://playcanvas.vercel.app/#/user-interface/text-emojis
So there’s a bit more setup needed to generate fonts that you can use in your game. Fortunately, this extra code doesn’t take up much space, so this isn’t as big of a problem once you make a helper function or class to easily create canvas fonts.
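A sketch of such a helper, with the option names (`fontName`, `fontSize`, `width`, `height`) taken from that emoji sample – verify them against your engine version before relying on this:

```typescript
import * as pc from "playcanvas";

// Sketch of a CanvasFont helper based on the emoji sample's options.
// The option names and sizes here are assumptions; check them against
// your engine version.
function createCanvasFont(app: pc.Application, text: string, size = 64): pc.CanvasFont {
    const font = new pc.CanvasFont(app, {
        color: new pc.Color(1, 1, 1), // white glyphs, so element color can tint them
        fontName: "Arial",
        fontSize: size,
        width: 256,
        height: 256
    });
    // Rasterize only the characters the game will actually display.
    font.createTextures(text);
    return font;
}

// Usage: pass every character the text element will ever show.
// const font = createCanvasFont(app, "SCORE: 0123456789");
// textEntity.element.font = font;
```

Because CanvasFont rasterizes glyphs at runtime, there's no asset to ship, which is what makes it a good fit for the 13 KB limit.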
General 3D and VR limits
This section isn’t completely specific to PlayCanvas, but I wanted to try out enough features someone may use in a VR game – 3D models, images, raycasting, screen mirroring – to get a better idea of what it would take for future js13k competitions.
3D Models
```typescript
import fencePostModel from "../assets/fence_post.glb";
// other imports

// Create fence posts around the perimeter
const fencePostModelFileData = {
    url: fencePostModel,
    filename: fencePostModel
};
const fenceAsset = new pc.Asset(fencePostModel, "container", fencePostModelFileData, null, null);
fenceAsset.once("load", function (containerAsset: pc.Asset) {
    const initialPos = new pc.Vec3(0, -1, 0);
    const rot = new pc.Vec3();
    const scale = new pc.Vec3(1, 1, 1);
    const fence = new FencePost(initialPos, rot, scale, containerAsset.resource as pc.ContainerResource);
    // Other code to generate a fence around the sides
});
app.assets.add(fenceAsset);
app.assets.load(fenceAsset);
```
You can definitely load .glb files. They have to be very simple geometry, though. For example, the fence post model I made in Blender and exported was around 4.4 KB (before compression in the final build), but I’m not a Blender expert or 3D modeler, so there are likely better formats or editors to use for custom meshes. There might even be 3D tools out there made specifically for js13kGames. I’ll probably look into that in the future.
Images and Textures
These are pretty simple to add. Use app.assets.loadFromUrl() and the callback function to edit an entity's render material. The game's grassy floor is just a 2×2-pixel, 96-byte PNG file with 2 shades of green. Here's a snippet of code for the floor in my game:
Edit 2025-09-25: Changed the code to use app.assets.loadFromUrl() instead of manually creating the Texture, making this much shorter.
```typescript
import floorImageFile from "../assets/floor.png";
// other imports

// Floor
const FLOOR_WIDTH = 8;
const FLOOR_LENGTH = 8;
app.assets.loadFromUrl(floorImageFile, "texture", function (err: string | null, asset: pc.Asset) {
    const floorTexture = asset.resource as pc.Texture;
    floorTexture.minFilter = pc.FILTER_NEAREST;
    floorTexture.magFilter = pc.FILTER_NEAREST;
    const floor = createBox(0, -1.5, 0, FLOOR_WIDTH + 1, 1, FLOOR_LENGTH + 1);
    floor.name = "Floor";
    const floorMaterial = floor.render.material as pc.StandardMaterial;
    floorMaterial.diffuse = new pc.Color(0.3, 0.81, 0.43);
    floorMaterial.diffuseMap = floorTexture;
    floorMaterial.diffuseMapTiling = new pc.Vec2(8, 8);
});
```
Raycasting from your VR controllers
I used a Quest 2 during development. PlayCanvas's built-in controller raycasts have a very specific angle offset, and I had to adjust the cylinder entities for the controllers to match them. What I don't know is whether this angle offset differs for other headsets' controllers. Maybe this is a solved problem and I just missed something in the API, but it's definitely something to keep in mind if you want to do raycasting and have it be consistent across different devices.
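For concreteness, here's roughly what I mean by "adjust the cylinder entities": tilt the visible pointer mesh until it lines up with the actual ray. The -45 degree value below is a placeholder, not the real offset – I found mine by trial and error on the Quest 2, and it may well differ per device.

```typescript
import * as pc from "playcanvas";

// Placeholder offset: tune this per headset until the visible pointer
// lines up with where the controller's raycast actually hits.
const POINTER_ANGLE_OFFSET_DEG = -45; // assumption, not the real value

// Attach a thin cylinder to a controller entity as a visible laser pointer,
// rotated by the offset so it follows the raycast direction.
function attachPointer(controllerEntity: pc.Entity): pc.Entity {
    const pointer = new pc.Entity("pointer");
    pointer.addComponent("render", { type: "cylinder" });
    pointer.setLocalScale(0.01, 0.5, 0.01);
    pointer.setLocalEulerAngles(POINTER_ANGLE_OFFSET_DEG, 0, 0);
    controllerEntity.addChild(pointer);
    return pointer;
}
```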
Mirroring your VR screen to the desktop
I don't know if this is an issue specific to PlayCanvas, but I couldn't figure out how to do this in time for the deadline. I did find a solution that partially worked here – https://forum.playcanvas.com/t/webxr-screen-mirroring/32909 – but the most I could get working was rendering both eyes on the left and right halves of the browser canvas, while the view inside the headset froze. Imagine you printed a screenshot of the game and stuck it in front of your face. That's what it looked like. So unfortunately, no screen mirroring for the game.
I think there are tools out there to mirror a Quest 2 to your desktop, but that seems like a messy setup, since you'd depend on a third-party tool when you can already record your web browser.
Other games & What’s next
All entries for js13kGames need to include their source code, so here’s mine – https://github.com/svntax/js13k-2025
I’m much better prepared for next year’s competition, if I decide to try the WebXR category again. Apparently, my game is only the third ever PlayCanvas game in all of js13kGames history. The first two games were:
Storage Space 13 by Sorskoot – https://js13kgames.com/2021/games/storage-space-13
Bitty Jump by mibix2 – https://js13kgames.com/2024/games/bitty-jump
I’ve actually made two WebXR games for js13kGames before, but only one was submitted in time.
Hookshot Fishing, made with A-Frame, 2019 – https://js13kgames.com/2019/games/hookshot-fishing
(Unfinished) Sandbox Digging Game, also made with A-Frame, 2020 – https://arcade.svntax.com/games/sandbox-digging-game-vr/
I hope this blog post helps anyone who wants to try out PlayCanvas for WebXR. And thanks to End3r and the js13kGames team and judges for this yearly competition!