3D Text with depth test in Unity

The 3D Text asset built into Unity always renders on top, regardless of z depth. This helps avoid z-fighting when your text is coplanar with other geometry in the scene, but it is often not what you want. I have labels in my scene which I DO want to be hidden by geometry in front of them. Unfortunately, there is no simple switch to turn on depth testing – even though, as far as I can tell, users have been struggling with this for years.

The solution requires a modified font shader, and to use that, you also need a font texture in your project. I had to gather the bits and pieces of information from various places, so I thought it might be a good idea to list all the steps together. Here we go:

  1. Download the Unity built-in shader archive from https://unity3d.com/get-unity/download/archive.
  2. Extract the .zip (at the time of writing: builtin_shaders-5.4.1f1.zip) into some arbitrary folder.
  3. Import DefaultResources/Font.shader into your project.
  4. Rename it, e.g. to ZFont.shader.
  5. Edit the shader source, and change “ZTest Always” to “ZTest LEqual”. Also change the name, e.g. to “GUI/ZText Shader”.
  6. Create a new material, and link it to your new shader.
  7. Import a font into the project. This is as easy as dragging a .ttf into the project window. I used OpenSansRegular.ttf from a sample project.
  8. Show your material in the inspector.
  9. Unfold the font entry in the project window. You will see a “Font Texture” entry. Drag that into the “Font Texture” area of the material displayed in the inspector.
  10. In the Text Mesh where you want to use the new shader, change the Mesh Renderer material to your new material. Change the Text Mesh font to your imported font.
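The two edits from step 5 look roughly like this in context. Everything else stays as in Unity's built-in Font.shader – the exact file layout varies between Unity versions, so treat this as a sketch, not the verbatim file:

```shaderlab
// ZFont.shader – excerpt showing only the two changed spots.
Shader "GUI/ZText Shader" {        // was: Shader "GUI/Text Shader"
    // ... Properties block unchanged ...
    SubShader {
        // ... Tags etc. unchanged ...
        Lighting Off Cull Off ZWrite Off
        ZTest LEqual                // was: ZTest Always
        // ... Pass unchanged ...
    }
}
```

With ZTest LEqual, the text only draws where it is at or in front of the existing depth buffer contents, which is exactly the occlusion behavior we want.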

And you’re done!

Custom assets with Unity

Today I’ve taken the time to clean up the architecture of my VR stars Unity app (I really have to find a name for this project!). One really cool Unity feature is the ability to create custom assets with scripts that run in the editor, so you don’t have to build complex structures at runtime.

Why is this so fantastic? Well, usually I would read star positions, constellation data, etc. from data files at application startup, and then create the appropriate data structures and meshes. Strictly speaking, this isn’t necessary, since that data is essentially static. With Unity, I can instead read all the data at edit time, and create a complete “star sphere” with meshes for the constellations as a ready-to-use asset.
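The basic mechanism is a minimal editor script that builds an object and saves it with AssetDatabase. This is just a sketch of the idea – the class and file names are placeholders, not my project's actual code:

```csharp
// Editor/StarSphereBuilder.cs – must live in an "Editor" folder.
using UnityEngine;
using UnityEditor;

public static class StarSphereBuilder
{
    // Adds a menu entry to the Unity editor; run it whenever the data changes.
    [MenuItem("Tools/Build Star Sphere Asset")]
    static void Build()
    {
        // Read the static star/constellation data here, at edit time,
        // and fill the mesh (vertices, colors, indices) from it.
        var mesh = new Mesh();

        // Persist the result as a project asset, so nothing needs to be
        // generated at runtime – just reference it from a MeshFilter.
        AssetDatabase.CreateAsset(mesh, "Assets/StarSphere.asset");
        AssetDatabase.SaveAssets();
    }
}
```

The created asset then shows up in the project window like any other mesh, which is also what makes it inspectable in the editor.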

As a bonus, this also allows me to investigate the created geometry and data structures in the editor (see the picture), without having to resort to runtime debug utilities. Very nice!

To the stars with Gear VR!

Getting more serious about VR… the HTC Vive is extremely cool, but quite an investment for starters. I decided to get myself a decent Android phone and a Gear VR, and started porting my astronomy app Cor Leonis to Unity.

Good progress so far! Stars, planets, moon, and sun are all in, and reading device location and compass is a breeze in Unity, so I can already view the current night sky in VR 🙂

Now on to make things pretty and creating a cool experience!

Lightbox Trace 2.0 with In-App feature

I recently published version 2.0 of my iPad app Lightbox Trace, introducing a filter panel as an optional In-App feature. This really comes in handy when your sketches don’t have enough contrast, or when you want to desaturate a colored image before tracing. Check it out!

Unfortunately, I ran into a problem with enabling the In-App purchase, so it just didn’t work for about a week. If you tried and got an error message about not being able to connect to the App Store, please try again. It really should be working now. If there are still any problems, please send me an e-mail. I appreciate your support!

VR User interface experiments

I’m currently experimenting with the UI for my upcoming Gear VR star gazing application. Virtual reality user interfaces are really interesting, since they have to work so differently from a standard 2D UI. One possible realization is to have interactive elements as actual 3D objects in the scene. This can be fun! For my app, I am thinking about putting a “time machine” into the scene, which will allow you to move forward and backwards in time for different views of the sky. Much cooler than having a 2D number selection thingy. Nothing to show yet, but stay tuned!

How to select and activate anything in a VR scene can be a science. For starters, I recommend having a look at Unity’s VR Sample Scenes project. It includes a bunch of useful scripts for reticles, selection radials, VR input handling, etc. This looks pretty convoluted at first, but once you get your head around it, it offers some nice ideas on how to architect an application UI.
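At its core, gaze selection boils down to a raycast from the head-tracked camera. This is not the VR Samples code itself, just a minimal sketch of the idea (the OnGazeOver message name is made up for illustration):

```csharp
using UnityEngine;

// Attach to any active object; assumes the main camera is the VR camera.
public class GazeSelector : MonoBehaviour
{
    public float maxDistance = 100f;

    void Update()
    {
        // The user "points" with their head: cast a ray straight ahead.
        Transform cam = Camera.main.transform;
        Ray ray = new Ray(cam.position, cam.forward);

        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            // hit.collider is the object under the reticle – highlight it,
            // start a selection timer, etc.
            hit.collider.SendMessage("OnGazeOver",
                SendMessageOptions.DontRequireReceiver);
        }
    }
}
```

The VR Samples project layers reticle rendering, fill-up selection radials, and input events on top of essentially this mechanism.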

New iOS app: Lightbox Trace

I’m currently spending a lot of time drawing on my iPad Pro, and needed a way to transfer my digital sketches to drawing paper. Essentially, a lightbox with the ability to display an image. Since I still had my old first-generation iPad lying around, I developed a simple little app to put it to use again: Lightbox Trace.

  • Load an image from Photos or the clipboard
  • Scale, position, and rotate as desired
  • Lock the screen – the app then ignores all touch events, so you can put a piece of paper on the display and trace the image
  • Display brightness is automatically increased to the maximum
  • You can also just show white, for tracing from one paper to another

I’ve found it to be quite useful – please try it out (it’s free!) and let me know if there is anything you’d like to see added.

Arduino Prototyping: It’s a clock!

Over the past few months, I’ve dug more deeply into the Arduino platform. One ongoing project is a clock with a moon phase display (since I already implemented the computations for my astronomy app, Cor Leonis). I started out with an LED matrix and 7-segment displays like this:

Tons of wire!

Over time, I decided to use two 8×8 LED matrices, switched to a smaller Arduino-compatible board (Adafruit Pro Trinket), and ran it on batteries:

There’s also a button to switch between views now.

It’s far from done, but I find it amazing how much I’ve already learned from this relatively simple project… a refresher on basic electronics (resistors, capacitors, etc.) and soldering, manual LED matrix display multiplexing, more about LEDs than I ever wanted to know, RTC clock chips, LED display driver chips, shift registers, step-up/down voltage converters, debouncing hardware buttons, I2C bus wiring and communications, calculating power consumption and battery lifetime, and so on and so forth. Next up: sensors. I would like to switch views just by waving my hand (and see how robust that is), instead of having to walk over and press a button.

Cor Leonis 5.0 released

After a looooong break, I picked up development on my astronomy app, Cor Leonis, again. The latest version 5.0 is available now for iOS devices in the Apple App Store. The one big feature that justifies the major version jump: the moon! While I was working on the moon info panel, I also considerably beefed up the information about the planets in our solar system. Hope you like it!

New demo: moon shader

When I was thinking about how to put the moon into the 3D view of my astronomy app, I figured it would be a waste to actually render a full textured sphere. After all, we always see the same side of the moon. All that is needed is a textured quad and a shader that emulates a lit sphere. In the end, the quad was reduced to a single vertex – a point sprite. Try it out in the demos section: WebGL moon shader
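The core trick is to reconstruct a sphere normal per fragment from the 2D position on the sprite, then light with that normal. This fragment shader is a generic sketch of the technique, not the demo's actual source (uniform names and texture mapping are assumptions):

```glsl
// WebGL 1 / GLSL ES fragment shader for a "lit sphere" point sprite.
precision mediump float;

uniform sampler2D moonTex;   // near-side moon texture
uniform vec3 lightDir;       // direction to the sun, in view space

void main() {
    // gl_PointCoord runs 0..1 across the sprite; remap to -1..1.
    vec2 p = gl_PointCoord * 2.0 - 1.0;
    float r2 = dot(p, p);
    if (r2 > 1.0) discard;   // outside the circular disc

    // Front hemisphere of a unit sphere: z = sqrt(1 - x^2 - y^2).
    vec3 n = vec3(p.x, -p.y, sqrt(1.0 - r2));

    // Simple diffuse lighting gives the moon phase for free:
    // the terminator is just where the dot product crosses zero.
    float diffuse = max(dot(n, normalize(lightDir)), 0.0);

    vec3 albedo = texture2D(moonTex, gl_PointCoord).rgb;
    gl_FragColor = vec4(albedo * diffuse, 1.0);
}
```

Since the same hemisphere always faces us, a flat texture lookup by sprite coordinate is enough – no sphere mapping needed.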

Animated IFS

This shows a combination of an IFS (iterated function system) and a particle system. In an IFS, points are attracted towards a fractal shape by iterating the positions over a set of affine transformations. The original algorithm starts with just one random point and plots its current position over a series of randomized transformations. In this example, I instead start out with a number of randomly distributed particles, which are then given target positions by running the IFS transformations.

When the general shape has settled, one additional iteration is performed every few seconds. It’s instructive to see how particles are warped from one place on the fractal shape to a completely different position.
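One iteration of the scheme described above can be sketched as follows. The affine maps here are the classic Sierpinski triangle set, standing in for the demo's actual transformations; in the demo, particles animate toward their new positions instead of jumping:

```csharp
using System;

class IfsSketch
{
    // Each map is x' = a*x + b*y + e, y' = c*x + d*y + f.
    // These three maps contract points toward the Sierpinski triangle.
    static readonly float[][] Maps =
    {
        new[] { 0.5f, 0f, 0f, 0.5f, 0f,    0f   },
        new[] { 0.5f, 0f, 0f, 0.5f, 0.5f,  0f   },
        new[] { 0.5f, 0f, 0f, 0.5f, 0.25f, 0.5f },
    };

    static void Main()
    {
        var rng = new Random(1);

        // Start with randomly distributed particles instead of a single point.
        var x = new float[5000];
        var y = new float[5000];
        for (int i = 0; i < x.Length; i++)
        {
            x[i] = (float)rng.NextDouble();
            y[i] = (float)rng.NextDouble();
        }

        // One IFS iteration: every particle picks a random map and gets a
        // new target position; repeating this settles the fractal shape.
        for (int i = 0; i < x.Length; i++)
        {
            var m = Maps[rng.Next(Maps.Length)];
            float nx = m[0] * x[i] + m[1] * y[i] + m[4];
            float ny = m[2] * x[i] + m[3] * y[i] + m[5];
            x[i] = nx;
            y[i] = ny;
        }
    }
}
```

Because every map picks a random particle-to-target assignment each round, a single late iteration can send a particle to a completely different part of the attractor – which is exactly the warping effect described above.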

The demo is implemented in Processing using processing.js. With 5000 particles moving around, performance depends on your browser and hardware. Unfortunately, PShape support in processing.js is not yet complete; using it would probably help speed this up.

Run demo in a popup