WebGL moon shader

Shows how to create a lit sphere with just one vertex and a custom shader. The demo creates a few point sprites and an animated point light circling around the “spheres”. Saves a lot of geometry!

Of course, if the viewer moves with respect to the geometry, you still see the same texture. So it is perfect for rendering the moon, which always shows the same face to us.

Open demo in new browser tab

To examine the source code, right-click into the browser window; your browser should offer an option to view the source of the current frame.

Cor Leonis 5.0 released

After a looooong break, I picked up development on my astronomy app, Cor Leonis, again. The latest version 5.0 is available now for iOS devices in the Apple App Store. The one big feature which justifies the major version jump: the moon! While I was working on the moon info panel, I also beefed up information about the planets in our solar system a lot. Hope you like it!

New demo: moon shader

When I was thinking about how to put the moon into the 3D view of my astronomy app, I figured it would be a waste to actually display a textured sphere. After all, we always see the same side of the moon. All that is needed is a textured quad, and a shader which emulates a lit sphere. In the end, the quad was reduced to a single vertex – a point sprite. Try it out in the demos section: WebGL moon shader
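The demo's shader is written in GLSL, but the sphere illusion itself is just a bit of math: treat the point sprite's texture coordinates as a slice through a unit sphere, reconstruct the missing z, and use the resulting vector as the surface normal for Lambertian shading. A minimal Python sketch of that per-pixel computation (illustration only, not the demo's shader; the function name and light setup are made up):

```python
import math

def sphere_shade(u, v, light_dir):
    """Shade one pixel of a point sprite as if it were a lit sphere.

    u, v are the sprite's texture coordinates in [0, 1] (like
    gl_PointCoord in GLSL); light_dir is a normalized 3D vector.
    Returns None for pixels outside the sphere's silhouette,
    otherwise the Lambertian diffuse intensity in [0, 1].
    """
    # map texture coords to [-1, 1] and test against the unit circle
    x, y = 2.0 * u - 1.0, 2.0 * v - 1.0
    r2 = x * x + y * y
    if r2 > 1.0:
        return None          # outside the silhouette -> discard
    # reconstruct z on the unit sphere; (x, y, z) is also the normal
    z = math.sqrt(1.0 - r2)
    lx, ly, lz = light_dir
    return max(x * lx + y * ly + z * lz, 0.0)

# light pointing straight at the viewer: the sprite center is fully lit
print(sphere_shade(0.5, 0.5, (0.0, 0.0, 1.0)))  # 1.0
```

In the fragment shader the same computation runs per fragment, with the normal also used to look up the moon texture.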

Animated IFS

This shows a combination of an IFS (iterated function system) and a particle system. In an IFS, points are attracted towards a fractal shape by iterating the positions over a set of affine transformations. The original algorithm starts with just one random point and plots its current position over a series of randomized transformations. In this example, I instead start out with a number of randomly distributed particles, which are then given target positions by running the IFS transformations.

When the general shape has settled, one additional iteration is performed every few seconds. It’s instructive to see how particles are warped from one place on the fractal shape to a completely different position.
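The particle variant of the algorithm described above is only a few lines of code. Here is a small Python sketch (not the Processing demo itself) using the classic Sierpinski triangle IFS – three affine maps that each halve the distance to one corner – applied to a whole particle cloud instead of a single wandering point:

```python
import random

# Sierpinski triangle as an IFS: three affine maps, each halving the
# distance to one corner of a triangle.
CORNERS = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def ifs_step(p):
    """Apply one randomly chosen transformation to point p."""
    cx, cy = random.choice(CORNERS)
    x, y = p
    return ((x + cx) / 2.0, (y + cy) / 2.0)

def run_particles(n_particles=5000, iterations=8, seed=1):
    """Instead of tracing one point, iterate a whole particle cloud.

    After a handful of iterations every particle lies (very nearly)
    on the attractor, no matter where it started.
    """
    random.seed(seed)
    particles = [(random.random(), random.random())
                 for _ in range(n_particles)]
    for _ in range(iterations):
        particles = [ifs_step(p) for p in particles]
    return particles

pts = run_particles()
# all particles stay inside the triangle's bounding box
assert all(0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 for x, y in pts)
```

In the demo, each extra iteration is what warps the particles to their new target positions.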

The demo is implemented in Processing using processing.js. With 5000 particles moving around, performance depends on your browser and hardware. Unfortunately, PShape support is not yet complete in processing.js, which would probably help to speed this up.

Run demo in a popup

Source

Cut-out shapes and image masking in processing(JS)

In Processing, it is not really easy to construct complex 2D geometry by subtracting shapes from each other, i.e., by creating cut-outs. You can resort to vertex contours, but if you just want to punch holes into a square, what can you do? Especially if you want to run in the browser using processing.js, things get somewhat tricky.

A pixel-based approach is to render the complex shape into an off-screen image, and draw the result transparently onto the display. The example discussed below is about creating a rather simple planet with a ring:

See the full listing at the end for the complete code.

blendMode()

In Processing 2, the new blendMode() function can be used to overwrite part of a shape with alpha-0 pixels:

PGraphics planetImage;

void setup()
{
  // blendMode() doesn't work properly with the default renderer, use P2D here
  size(500, 500, P2D);

  // create off-screen buffer with transparent background
  planetImage = createGraphics(200, 200, P2D);
  planetImage.beginDraw();
  planetImage.background(0, 0);
  
  planetImage.translate(100, 100);
  planetImage.rotate(0.5);
  
  // draw circle
  planetImage.noStroke();
  planetImage.fill(255, 220, 0);
  planetImage.ellipse(0, 0, 200, 100);

  // replace part of the circle with alpha 0 - "make a hole"
  planetImage.blendMode(REPLACE);
  planetImage.fill(255, 255, 255, 0);
  planetImage.noStroke();
  planetImage.ellipse(0, -5, 160, 70);

  // add "planet"
  planetImage.fill(255, 220, 0, 255);
  planetImage.ellipse(0, 0, 120, 120);

  planetImage.endDraw();
} 

The result can be rendered nicely on top of a background with image(planetImage, …). Unfortunately, blendMode() doesn’t work yet with processing.js, and the blend() function doesn’t allow you to overwrite the alpha channel in the same way as blendMode(REPLACE).

Image masking

Another approach is to render the shape onto a black background and then create a mask image to cut out the shape transparently. In Processing 2, this can be done with the PImage.mask() method. But again, mask() is not yet supported in processing.js. Instead, we can create a separate mask image and render our shape in two passes using blend():

PImage maskImage;

void setup()
{
  ...
  
  // create a white on black mask image by thresholding the off-screen image
  maskImage = planetImage.get();
  maskImage.filter(THRESHOLD, 0.1);
}

void draw()
{
  ...
  
  // subtract white planet image - results in a black planet
  blend(maskImage, 0, 0, 200, 200, xPos, yPos, 200, 200, SUBTRACT);
  
  // add colored planet image on top
  blend(planetImage, 0, 0, 200, 200, xPos, yPos, 200, 200, ADD);
} 

If you wanted to use black in the solid part of the shape, the creation of the mask would obviously have to be modified. Another disadvantage of this approach is that you cannot transform images rendered with blend(): it is usually easy to draw a rotated image using rotate(angle) followed by image(…), but with the blend() function, the rotation simply has no effect.

Phew! So, as promised, here is the complete working example:

PGraphics planetImage;
PImage bgImage;
PImage maskImage;

void setup()
{
  size(500, 500);

  // create off-screen planet shape with black background (rendered transparently later on)

  // create off-screen buffer with black background
  planetImage = createGraphics(200, 200);
  planetImage.beginDraw();
  planetImage.background(0);

  planetImage.translate(100, 100);
  planetImage.rotate(0.5);

  // draw circle
  planetImage.noStroke();
  planetImage.fill(255, 220, 0);
  planetImage.ellipse(0, 0, 200, 100);

  // cut out part of the circle
  planetImage.fill(0);
  planetImage.noStroke();
  planetImage.ellipse(0, -5, 160, 70);

  // add "planet"
  planetImage.fill(255, 220, 0);
  planetImage.ellipse(0, 0, 120, 120);

  planetImage.endDraw();

  // create a white on black mask image by thresholding the off-screen image
  maskImage = planetImage.get();
  maskImage.filter(THRESHOLD, 0.1);

  // draw a background pattern...
  background(0, 0, 120);
  fill(255);
  stroke(255, 255, 255, 50);
  strokeWeight(2);
  for (int i = 0; i < 100; ++i)
  {
    float size = random(2, 5);
    ellipse(random(width), random(height), size, size);
  }

  // ... and save it to an image, so we can re-render it easily
  bgImage = get();
}

void draw()
{
  // draw background pattern
  image(bgImage, 0, 0);

  int xPos = frameCount % (width + 200) - 200;
  int yPos = 100;

  // subtract white planet image - results in a black planet
  blend(maskImage, 0, 0, 200, 200, xPos, yPos, 200, 200, SUBTRACT);

  // add colored planet image on top
  blend(planetImage, 0, 0, 200, 200, xPos, yPos, 200, 200, ADD);
}

Processing(JS)

Recently, I dusted off my copy of “The Computational Beauty of Nature” and started rediscovering this still wonderful book. The chapter about IFS fractals inspired me to do some experimenting with animated fractal shapes. An opportune moment to learn more about Processing! After playing with it for a few hours, I have to say, this is a wonderful programming environment for this kind of visual experiment. Virtually no boilerplate code, no cumbersome project setup, etc. Just start hacking away at your ideas. Based on Java, it is not a toy language either, so all the usual data structures and OOP constructs are readily available.

An additional treat: With processing.js, it is quite easy to run a Processing application in a browser. See the IFS animation demo in the new demo section on the left (hope to add more to that category soon 🙂 ). Support is not complete, though: I had to rewrite my demo to some extent, because processing.js doesn't support the PShape class very well yet, which is quite essential for good performance in a particle system demo… so I reduced the visuals a bit. Still, much easier than having to go through and translate everything to JavaScript myself!

Bouncing Ball Physics

To revive my Blender skills, I’ve been tinkering with setting up a simple bouncing ball animation. How do you keyframe this properly, without running a physics simulation? There are tons of tutorials on the web on basic bouncing ball demos, but few go into details about what a physically plausible bouncing ball trajectory would look like. As it turns out, with an ideal bouncing ball, there are only a few basic ingredients:

  • The path is obviously a series of parabolas.
  • With each bounce, a roughly constant fraction of the energy is lost. The exact value depends on the material of the ball – the magic term is “coefficient of restitution” (COR). The height of each parabolic arc is f * previous_height, where f is in the range (0, 1).
  • Assuming no slowdown in the horizontal direction, the distance between touch-down positions (and likewise the duration of each bounce) shortens by the square root of f.
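These ingredients make the keyframe values easy to compute. A small Python sketch (illustration only; h0, vx and g are made-up example parameters, not taken from any Blender setup):

```python
import math

def bounce_keyframes(h0=2.0, f=0.55, vx=1.0, g=9.81, n_bounces=5):
    """Apex heights and touchdown times/positions for an ideal bounce.

    f is the fraction of energy kept per bounce, so each arc's apex is
    f times the previous one, and both arc duration and horizontal
    spacing shrink by sqrt(f). The ball is dropped from rest at h0.
    """
    frames = []
    t = math.sqrt(2.0 * h0 / g)   # time of the initial fall
    x = vx * t
    h = h0
    for _ in range(n_bounces):
        h *= f                                # apex of the next arc
        arc_t = 2.0 * math.sqrt(2.0 * h / g)  # time up and down again
        t += arc_t
        x += vx * arc_t
        frames.append((round(t, 3), round(x, 3), round(h, 3)))
    return frames

for t, x, h in bounce_keyframes():
    print(t, x, h)
```

Each printed triple is one touchdown (time, horizontal position) plus the apex height of the arc that preceded it – exactly the values you would keyframe.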

So far, so good, but is this model realistic? I did a few experiments tracing bouncing ball trajectories from video.

First, a tennis ball (at 50 fps):

Doing rough calculations based on the pixel positions of the ball’s center, the behavior is close enough to the model, with a COR of roughly 0.55. Great.

Second, a very squishy rubber ball:

Surprise: The same calculations show that this ball keeps bouncing a bit higher than expected every time! The COR rises from 0.34 to 0.55 over four bounces. I even repeated the experiment, with similar results. Apparently, a non-constant COR is not unusual at slow speeds, as mentioned, e.g., in the Wikipedia article on the subject.
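For reference, the ratio of successive apex heights is how such a per-bounce f (and from it a COR estimate) can be read off a traced video. A tiny Python sketch with made-up pixel heights, not the actual measurements:

```python
def height_ratios(apex_heights):
    """Ratio of successive bounce heights from traced apex positions.

    In the ideal model this ratio is the constant f from the bullet
    list above; a rising sequence means the ball keeps more of its
    energy at lower speeds. The heights below are hypothetical
    illustration values, not the ones measured from the video.
    """
    return [round(b / a, 2)
            for a, b in zip(apex_heights, apex_heights[1:])]

# hypothetical pixel heights of four successive apexes
print(height_ratios([200.0, 68.0, 30.0, 16.5]))  # [0.34, 0.44, 0.55]
```

Because heights scale with the square of the rebound speed, the velocity-based COR would be the square root of these ratios.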

Smoke Simulation

smoke simulation

As a first step towards a full-featured fluid simulator, I am currently working on smoke simulation, and have now got something running for the simplest case: smoke in an open volume, i.e., without any solid objects or boundaries. The attached video shows 10 seconds of simulation with a small heat / velocity source at the bottom left. Looks neat already!

The implementation follows the approach laid out in the SIGGRAPH 2007 Course Notes on Fluid Animation. In brief, this is a semi-Lagrangian advection scheme, running on a 128^3 grid. My current single-threaded CPU implementation is ridiculously slow (about 4 frames / minute on my i7 laptop), so I am going to investigate parallelization, probably with OpenCL.
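The core idea of the scheme is easy to show in one dimension. A minimal Python sketch of a single semi-Lagrangian step on a periodic 1D grid (illustration only – the real simulation runs on a 128^3 grid with a full velocity field):

```python
def advect(field, velocity, dt, dx=1.0):
    """One semi-Lagrangian advection step on a 1D periodic grid.

    For each cell, trace backwards along the (constant, in this
    sketch) velocity to find where its value came from, and sample
    the old field there with linear interpolation. This back-tracing
    is what makes the scheme unconditionally stable.
    """
    n = len(field)
    new_field = [0.0] * n
    for i in range(n):
        # departure point of cell i, in grid coordinates
        src = (i - velocity * dt / dx) % n
        j = int(src)
        frac = src - j
        new_field[i] = (1.0 - frac) * field[j] + frac * field[(j + 1) % n]
    return new_field

# a spike of "smoke" drifts right by one cell per step
f = [0.0] * 8
f[2] = 1.0
f = advect(f, velocity=1.0, dt=1.0)
print(f.index(max(f)))  # 3
```

In the 3D simulator the same back-trace and interpolation are applied per voxel to density, temperature and velocity.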

Only the ray marching volume shader utilizes the GPU so far.

I’ve had problems playing the video in Firefox – you might want to try another browser!