Venturing into social VR with Pocket Observatory!

The past few weeks I’ve been working away on a really exciting feature for the upcoming Gear VR version of Pocket Observatory: You will be able to invite a friend (on the Oculus platform) and start a voice chat beneath the stars! GPS coordinates are exchanged between the app instances, so players can visit each other’s GPS locations. This is currently under review, and will hopefully be up in a few weeks in the Oculus Store.
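For the curious: the coordinate exchange can be sketched roughly like this with the Oculus Platform SDK's peer-to-peer packet API. The class and structure below are illustrative only (not the actual app code), and the packet layout is an assumption; the `Net.SendPacket`/`Net.ReadPacket` calls are from the Platform SDK.

```csharp
using UnityEngine;
using Oculus.Platform; // Oculus Platform SDK: Net, SendPolicy, Packet

public class LocationExchange : MonoBehaviour
{
    private ulong _peerID; // set once the invite has been accepted (illustrative)

    // Send our GPS position to the connected peer (8 bytes: lat + lon).
    public void SendLocation(float latitude, float longitude)
    {
        var bytes = new byte[8];
        System.BitConverter.GetBytes(latitude).CopyTo(bytes, 0);
        System.BitConverter.GetBytes(longitude).CopyTo(bytes, 4);
        Net.SendPacket(_peerID, bytes, SendPolicy.Reliable);
    }

    void Update()
    {
        // Poll for incoming packets from the peer.
        Packet packet;
        while ((packet = Net.ReadPacket()) != null)
        {
            var bytes = new byte[(int)packet.Size];
            packet.ReadBytes(bytes);
            float lat = System.BitConverter.ToSingle(bytes, 0);
            float lon = System.BitConverter.ToSingle(bytes, 4);
            // ... move the sky view to the peer's location ...
            packet.Dispose();
        }
    }
}
```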

To my knowledge, this is the very first social VR astronomy app ever! I’d been thinking about this for quite a while during initial development, but didn’t realize how easy it would be to integrate using the Oculus platform SDK. Mind you, setting up peer-to-peer networking can still be nerve-wracking, given the unreliable nature of network communication, but I still managed to pull this off in just a few weeks. Happy!

Check out the updated page at https://pocketobservatory.com for the details. Here’s a screenshot of the chat UI: (thinking about avatars and a shared space experience, too, but that’s for later.)

Educational VR molecules

Now that I’ve gained some experience with Virtual Reality and my astronomy app, I’m thinking educational software for VR could be a worthwhile field for future projects. So I’ve started tossing ideas about, one of which involves playing with molecules in a VR environment.

Aspirin molecule

Pocket Observatory for Google Cardboard

Just finished and submitted the iPhone / Google Cardboard version of Pocket Observatory! It really paid off to use Unity – porting from Android with the Oculus SDK to iPhone with GoogleVR turned out to be really easy.

Here are the quirks I encountered, might be useful to know if you’re embarking on a similar project:

  • In Gear VR, system messages (e.g., asking for permissions) are displayed properly and can be confirmed while you’re in VR. On the iPhone, a standard system dialog pops up. To deal with location service permissions, I trigger the message from within a special startup scene, before entering VR mode in the main scene.
  • Texture compression support has to be adjusted per platform. On the iPhone, compression defaults to PVRTC, which requires square textures. The Unity importer stretches non-square textures to make them compressible with PVRTC. This results in awful artifacts, so I had to go over the compression options for all of my (non-square) textures.
  • Make sure the text for camera use permission is set in the iOS player settings – in GoogleVR, there is a UI button to allow the user to switch viewers. This will activate the phone’s camera in order to scan the QR code on the viewer. Not setting the text will result in an app crash.
  • Unfortunately, the Cardboard app doesn’t run in the simulator – there is no suitable architecture of the gvr library, so the app crashes at startup. I guess it would be possible to build the library from source, but haven’t tried that yet.
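The startup-scene trick from the first point can be sketched like this (a minimal version, assuming a scene named “Startup” that runs before the VR main scene; scene names are placeholders):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to an object in a "Startup" scene that runs before VR mode,
// so the iOS location permission dialog appears outside of VR.
public class StartupPermissions : MonoBehaviour
{
    IEnumerator Start()
    {
        // Starting the location service triggers the iOS permission dialog.
        Input.location.Start();

        // Wait until the user has answered the dialog (the service leaves
        // the Initializing state), with a simple timeout.
        float timeout = 20f;
        while (Input.location.status == LocationServiceStatus.Initializing
               && timeout > 0f)
        {
            timeout -= Time.deltaTime;
            yield return null;
        }

        // Enter the VR main scene either way; the app can fall back to a
        // default location if permission was denied.
        SceneManager.LoadScene("Main");
    }
}
```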

Visit https://pocketobservatory.com for details regarding app features and release plans!

Pocket Observatory: Finalizing!

Yes, I have decided on a name for my upcoming Gear VR astronomy app: Tadaa – Pocket Observatory! Seems it is getting to a decent stage… it’s hard to stop adding features when new ideas pop up every five minutes, but this all has to wait for future releases. Now, it’s all about polishing and optimizing!

I found a really, really helpful guide to optimizing Gear VR apps on the Oculus developer blog: https://developer3.oculus.com/blog/squeezing-performance-out-of-your-unity-gear-vr-game/. This certainly saved me a few headaches.

Here’s a very first video impression of the app: https://youtu.be/G4tHM2v0NyY. I was actually wondering how I could capture a video like this, but it turned out to be super easy: integrate the platform menu provided with the OVR toolkit, and the recording function is available there by pressing the back button 🙂

‘Simple’ property animation in Unity

Recently, I’ve spent a few hours with Unity’s new animation system, Mecanim. I wasn’t working on any complex animations, but only wanted to implement fading of a few elements in my scene. This turned out to be a real disaster. Why?

  • For simply fading a property in and out, I needed two animation clips and four states, with a pretty sensitive transition setup (I wanted the fading to be interruptible, etc.)
  • The animations cannot go from the current property value to the target at a certain speed. They will always animate from start to end.
  • Adding a delay would require either modifying the animation clip or an even more complex state setup.

So, after fiddling with this for a while, I came to the conclusion that this kind of animation is better done in code (which seems to be what people in the forums think, too). I’ve set up a simple class for fading a property value, added the logic to my MonoBehaviour script, and had it working the way I wanted within 20 minutes. Go figure.

For anyone interested, here is the code. It could probably be improved in various ways, but it’s doing what I want for my project. Feel free to play with it.

ValueFader.cs:

 public struct FadeParameters
 {
   public float initialValue;
   public float maxValue;
   public float minValue;
   public float fadeInDuration;
   public float fadeOutDuration;
   public float fadeInDelay;
     // applies only when going from Idle to FadingIn
     // (not e.g. FadingOut -> FadingIn)
 }
 
 public class ValueFader
 {
   private enum FadeState
   {
     FadingIn,
     FadingOut,
     Idle
   }
 
   private float _remainingDelay;
   private float _currentValue;
   private FadeParameters _fadeParameters;
   private FadeState _currentFadeState = FadeState.Idle;
   private float _fadeInIncrement;
   private float _fadeOutIncrement;
   public float currentValue { get { return _currentValue; } }
 
   public ValueFader(FadeParameters fp)
   {
     _fadeParameters = fp;
     _fadeInIncrement =
       (_fadeParameters.maxValue - _fadeParameters.minValue)
       / _fadeParameters.fadeInDuration;
     _fadeOutIncrement =
       (_fadeParameters.maxValue - _fadeParameters.minValue)
       / _fadeParameters.fadeOutDuration;
     _currentValue = fp.initialValue;
     _remainingDelay = fp.fadeInDelay;
   }
 
   public void FadeIn()
   {
     if (_currentValue < _fadeParameters.maxValue)
     {
       _currentFadeState = FadeState.FadingIn;
     }
     else
     {
       SetIdle();
     }
   }
 
   public void FadeOut()
   {
     if (_currentValue > _fadeParameters.minValue)
     {
       _currentFadeState = FadeState.FadingOut;
     }
     else
     {
       SetIdle(); // might have been in FadingIn, during delay
     }
   }
 
   // Returns true if currentValue was changed by this method
   public bool Update(float deltaTime)
   {
     if (_currentFadeState == FadeState.Idle)
     {
       return false;
     }
 
     bool retVal = false;
 
     if (_currentFadeState == FadeState.FadingIn)
     {
       if (_remainingDelay <= 0.0f)
       {
         _currentValue += _fadeInIncrement * deltaTime;
         if (_currentValue >= _fadeParameters.maxValue)
         {
           _currentValue = _fadeParameters.maxValue;
           _currentFadeState = FadeState.Idle;
         }
 
         retVal = true;
       }
       else
       {
         _remainingDelay -= deltaTime;
       }
     }
     else //if (_currentFadeState == FadeState.FadingOut)
     {
       _currentValue -= _fadeOutIncrement * deltaTime;
       if (_currentValue <= _fadeParameters.minValue)
       {
         _currentValue = _fadeParameters.minValue;
         SetIdle();
       }
 
       retVal = true;
     }
 
     return retVal;
   }
 
   private void SetIdle()
   {
     _currentFadeState = FadeState.Idle;
     _remainingDelay = _fadeParameters.fadeInDelay;
   }
 }
 

And here is how you would use it from within a MonoBehaviour script.

Set up the animation parameters like this, and store the ValueFader in a class member:

FadeParameters imageFP;
imageFP.fadeInDuration = 1.0f;
imageFP.fadeOutDuration = 0.7f;
imageFP.initialValue = 0.0f;
imageFP.maxValue = 0.1f;
imageFP.minValue = 0.0f;
imageFP.fadeInDelay = 1.0f;

_imageFader = new ValueFader(imageFP);  

How you control fading values in or out depends entirely on your logic. You might want to do it based on some event:

void OnSomethingHappened()
 {
   ...
   if (startFadingIn)
   {
     _imageFader.FadeIn();
   }
   else if (startFadingOut)
   {
     _imageFader.FadeOut();
   }
 } 

You get the idea… In Update(), apply the animated value to a material color, or whatever you want to animate:

void Update()
{
  if (_imageFader.Update(Time.deltaTime))
  {
    _imageMaterial.SetColor(
      "_Color", new Color(1.0f, 1.0f, 1.0f, _imageFader.currentValue));
  }
  ...
 } 

And that’s all 🙂

3D Text with depth test in Unity

The 3D Text asset built into Unity always renders on top, regardless of z depth. This helps avoid z-fighting if your text is coplanar with other geometry in the scene, but it’s often not what you want. I have labels in my scene which I DO want to be hidden by geometry in front of them. Unfortunately, there is no simple switch to turn on depth testing, even though users have been struggling with this for years, as far as I can tell.

The solution requires a modified font shader, and to be able to use that, you also need a font texture in your project. I had to retrieve all the bits and pieces of information from various places, so I thought it might be a good idea to list all the steps together. Here we go:

  1. Download the Unity built-in shader archive from https://unity3d.com/get-unity/download/archive.
  2. Extract the .zip (at the time of writing: builtin_shaders-5.4.1f1.zip) into some arbitrary folder.
  3. Import DefaultResources/Font.shader into your project.
  4. Rename it, e.g. to ZFont.shader.
  5. Edit the shader source, and change “ZTest Always” to “ZTest LEqual”. Also change the name, e.g. to “GUI/ZText Shader”.
  6. Create a new material, and link it to your new shader.
  7. Import a font into the project. This is as easy as dragging a .ttf into the project window. I used OpenSansRegular.ttf from a sample project.
  8. Show your material in the inspector.
  9. Unfold the font entry in the project window. You will see a “Font Texture” entry. Drag that into the “Font Texture” area of the material displayed in the inspector.
  10. In the Text Mesh where you want to use the new shader, change the Mesh Renderer material to your new material. Change the Text Mesh font to your imported font.
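The edit in step 5 ends up looking roughly like this (an abbreviated excerpt; everything elided stays exactly as shipped in the built-in Font.shader):

```
// ZFont.shader (renamed copy of the built-in Font.shader)
Shader "GUI/ZText Shader" {
    // ... Properties unchanged ...
    SubShader {
        // ... Tags and blend setup unchanged ...
        ZTest LEqual  // was: ZTest Always (text is now hidden by closer geometry)
        // ...
    }
}
```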

And you’re done!

Custom assets with Unity

Today I’ve taken the time to clean up the architecture of my VR stars Unity app (I really have to find a name for this project!). One really cool Unity feature is the ability to create custom assets by scripts run in the editor, so you don’t have to build complex structures at runtime.

Why is this so fantastic? Well, usually I would read star positions, constellation data, etc. from data files at application startup, and then create appropriate data structures and meshes. Strictly speaking, this isn’t necessary, since that data is essentially static. With Unity, I can now read all the data already at edit time, and create a complete “star sphere” with meshes for the constellations as a ready-to-use asset.
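The pattern itself is just a few lines of editor scripting. Here is a minimal sketch (file paths, names, and the data format are placeholders, not the actual project code): a menu item reads static data at edit time, builds a mesh, and saves it as a regular asset via `AssetDatabase`.

```csharp
using UnityEngine;
using UnityEditor; // editor-only script: must live in an "Editor" folder

public static class StarSphereBuilder
{
    // Runs in the editor, not at runtime.
    [MenuItem("Tools/Build Star Sphere Asset")]
    public static void Build()
    {
        // Read the static star data at edit time (path/format are placeholders).
        TextAsset data = AssetDatabase.LoadAssetAtPath<TextAsset>(
            "Assets/Data/stars.txt");

        var mesh = new Mesh();
        // ... fill mesh.vertices / mesh.triangles from the parsed data ...

        // Save the generated mesh as an asset, ready to reference from scenes.
        AssetDatabase.CreateAsset(mesh, "Assets/Generated/StarSphere.asset");
        AssetDatabase.SaveAssets();
    }
}
```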

As a bonus, this also allows me to investigate the created geometry and data structures in the editor (see the picture), without having to resort to runtime debug utilities. Very nice!

To the stars with Gear VR!

Getting more serious about VR… the HTC Vive is extremely cool, but quite some investment for starters. Decided to get myself a decent Android phone and a Gear VR, and started porting my astronomy app Cor Leonis to Unity.

Good progress so far! Stars, planets, moon, and sun are all in, reading device location and compass is a breeze in Unity, so I can already have a view of the current night sky in VR 🙂
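Reading location and heading really is just a few lines in Unity. A minimal sketch (a real app should also handle denied permissions and service failures):

```csharp
using UnityEngine;

public class SkyOrientation : MonoBehaviour
{
    void Start()
    {
        Input.location.Start();       // GPS (prompts for permission if needed)
        Input.compass.enabled = true; // magnetometer
    }

    void Update()
    {
        if (Input.location.status == LocationServiceStatus.Running)
        {
            float latitude  = Input.location.lastData.latitude;
            float longitude = Input.location.lastData.longitude;
            float heading   = Input.compass.trueHeading; // degrees from north

            // ... use these to orient the star sphere to the current sky ...
        }
    }
}
```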

Now on to make things pretty and creating a cool experience!

Lightbox Trace 2.0 with In-App feature

I recently published version 2.0 of my iPad app Lightbox Trace, introducing a filter panel as an optional In-App feature. This really comes in handy when your sketches don’t have enough contrast or you want to desaturate a colored image before tracing. Check it out!

Unfortunately, I ran into a problem with enabling the In-App purchase, so it just didn’t work for about a week. If you tried and got an error message about not being able to connect to the App Store, please try again. It really should be working now. If there are still any problems, please send me an e-mail. I appreciate your support!