Manual texture filtering for pixelated games in WebGL

Left: linear filtering; middle: nearest-neighbor sampling; right: custom texture filter. The knight is part of the Wesnoth Frankenpack. Click here to see the texture filter demo in action.

If you’re writing a pixelated game that heavily magnifies its textures, you’re probably using the nearest texture filter so your game looks like the knight in the middle instead of the one on the left.

The problem with nearest texel sampling is that it’s susceptible to aliasing if the texels are not aligned with the screen pixels, which can happen if you apply transformations such as rotation and shearing to your textured polygons. Ideally, we would like to have smooth transitions between neighboring texels in the final image, like the knight on the right in the image above.
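As a rough sketch of how such a filter can look (a common approach, not necessarily the exact filter derived in this post), the fragment shader below assumes the texture is bound with LINEAR filtering and that uTexture, uTextureSize and vUv are placeholder names supplied by the application. It snaps the UVs so the hardware interpolation only acts in a thin band around each texel boundary:

precision mediump float;

uniform sampler2D uTexture;
uniform vec2 uTextureSize; // texture dimensions in texels (placeholder name)

varying vec2 vUv;

void main() {
    // Work in texel space, where texel boundaries lie on integer coordinates.
    vec2 texel = vUv * uTextureSize;

    // Snap to the nearest boundary and allow interpolation only within a
    // fixed 0.1-texel band around it. Outside the band the sample lands on a
    // texel center (same result as nearest sampling); inside it, the LINEAR
    // filter blends the two neighboring texels smoothly.
    vec2 seam = floor(texel + 0.5);
    vec2 uv = seam + clamp((texel - seam) / 0.1, -0.5, 0.5);

    gl_FragColor = texture2D(uTexture, uv / uTextureSize);
}

The fixed band width could also be replaced by fwidth(texel) (available in WebGL 1 through the OES_standard_derivatives extension), so the transition always spans roughly one screen pixel regardless of the magnification factor.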

Continue reading


Custom shaders with Three.JS: Uniforms, textures and lighting

If you’re familiar with WebGL and GLSL programming and have started using three.js, you’ll eventually run into a situation where you want to write your own shader while still using the resources this library provides. In this post, I’ll show you how to set up a custom shader with a three.js geometry, pass it your own uniforms, bind a texture to a particular uniform, and receive all the lights you’ve added to the scene.
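A minimal sketch of what the JavaScript side can look like (assuming an existing scene and shader sources in script tags; names such as uTime, uDiffuseMap and knight.png are placeholders, older three.js releases also require a type field on each uniform, and the layout of the light uniforms differs between versions):

// Merge three.js's built-in light uniforms with our own custom uniforms.
var uniforms = THREE.UniformsUtils.merge([
    THREE.UniformsLib['lights'],
    {
        uTime:       { value: 0.0 },  // custom scalar uniform
        uDiffuseMap: { value: null }  // texture slot, filled in below
    }
]);

// Assign the texture after merge(), since merge() clones uniform values.
uniforms.uDiffuseMap.value = new THREE.TextureLoader().load('knight.png');

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: document.getElementById('vertex-shader').textContent,
    fragmentShader: document.getElementById('fragment-shader').textContent,
    lights: true  // required so three.js actually updates the light uniforms
});

// Any three.js geometry works; the shader receives its attributes as usual.
var mesh = new THREE.Mesh(new THREE.PlaneGeometry(1, 1), material);
scene.add(mesh);

// Uniforms can then be updated every frame through material.uniforms, e.g.:
material.uniforms.uTime.value = performance.now() * 0.001;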

Continue reading

Hello, world!

I spent the last few months working on a game project that proved too ambitious for the manpower I had available. So we decided to make small games instead, learn as much as possible from the experience, and share our findings with the public.

In this post I’ll start by revealing the final-year project I conducted as a computer science student at Universidade Federal de Minas Gerais (UFMG) back in 2010. Although it was not related to games, the idea was a system very similar to the Oculus Rift, in which you could control a robot instead of a game character.

The video shows a pair of cameras mounted on a rig with two servo motors, which let the cameras rotate in pitch and yaw. The user would wear a headset (not shown in the video) with an attached IMU (a sensor that can estimate orientation in space), and the cameras would be moved to match the user’s head orientation.