
Implementing mixed reality with the Samsung Gear VR

The Gear VR, as it stands now, is one of the most affordable options for experiencing immersive Virtual Reality. If you own a compatible device, the Gear VR is a must-have.

In this post I describe a Mixed Reality Android project that I created to start learning AR/VR development for the Gear VR.

Video of the demo running in development mode.


Mixed reality combines AR and VR. In this project, the AR part consists of displaying animated 3D models on top of tracked images; the VR part consists of projecting the see-through camera feed plus the AR imagery into the headset, so we can see the outside world while wearing the Gear VR.


Project overview.

The following image shows the VR Scene setup.


In this project I’m working with the GearVR Framework (GearVRf), an open source framework for creating apps for the Samsung Gear VR headset. I highly recommend having a look at the sample apps; most questions about how to implement certain functionality with GearVRf are answered in their code.


I took the vuforia-sample as the base for this project and then added the animated 3D models using .dae files that I exported with Blender. As you can tell, the demo makes use of the Vuforia mobile SDK (vuforia-sdk-android-5-5-9) to implement the markerless AR functionality.

There are several options for implementing AR on mobile; for this project I wanted to track images and then display a 3D model on top of each tracked image. Vuforia already provides out-of-the-box functionality to tag and detect images and to track the resulting homography; it also provides a free license for personal projects.

OpenCV is also an option, but perhaps for another project, since it would take more effort to implement a bag-of-words model, a multi-class classification algorithm, and so on (something I’ve been learning from several Computer Vision courses on Coursera).

Required tools to build the project:

In order to build the source code you will need:


Loading the 3D models.

I got two animated 3ds Max models from TurboSquid. Before loading them with the GearVR framework, I first had to import them into Blender to align their orientations, and then export them to COLLADA format (.dae).


GearVRf can load models in several 3D formats; internally it wraps the Assimp library, which makes loading 3D models and animations really easy.

Loading the t-rex model:

GVRModelSceneObject rexModel = gvrContext.loadModel("Tyrannosaurus.dae");

Starting the animation:

List<GVRAnimation> animations = mModel.getAnimations();

if (animations.size() >= 1) {
    mAnimation = animations.get(0); // get the first animation
    mAnimation.setRepeatMode(GVRRepeatMode.REPEATED).setRepeatCount(-1); // loop the animation
    mAnimation.start(mAnimationEngine); // start the animation
}



Tracking images with Vuforia.


The Vuforia Developer Portal allows you to upload the images you want to track. The online tool generates a database, which is a set of files containing the feature descriptors of the images; the Vuforia mobile SDK then uses the database data to detect and classify the tracked images.


The generated database files are located in the app’s assets folder.
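As a rough sketch, loading and activating one of these databases with the Vuforia Java API looks something like the following. The file name StonesAndChips.xml is a placeholder for your own database, and note that in SDK 5.x the classes live under com.qualcomm.vuforia (renamed to com.vuforia in later versions):

```java
import com.qualcomm.vuforia.DataSet;
import com.qualcomm.vuforia.ObjectTracker;
import com.qualcomm.vuforia.STORAGE_TYPE;
import com.qualcomm.vuforia.TrackerManager;

// Sketch: load the database exported from the Vuforia Developer Portal
// (an .xml/.dat pair placed in the APK assets folder) and start tracking.
ObjectTracker tracker = (ObjectTracker) TrackerManager.getInstance()
        .getTracker(ObjectTracker.getClassType());
DataSet dataSet = tracker.createDataSet();
if (dataSet.load("StonesAndChips.xml", STORAGE_TYPE.STORAGE_APPRESOURCE)) {
    tracker.activateDataSet(dataSet);
    tracker.start();
}
```

This runs once during Vuforia initialization, before the render loop starts asking the tracker for results.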


Each 3D model must be associated with its corresponding tracked image in the Vuforia database. For this I created the TrackedModel class, which holds the 3D model and the image id; on each frame it updates the 3D model transform using the transformation matrix returned by the Vuforia SDK during tracking.

public class TrackedModel extends GVRSceneObject {
    private GVRModelSceneObject mModel; // the animated 3D model
    private int mTrackingId; // the id of the tracked image

    // stores the latest Vuforia transform matrix and scale
    public void setTrackedPose(float[] matrix, float scale);

    // updates the model transform using the most recent Vuforia transform and scale
    private void updateTrackedPose();
}
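One detail worth spelling out: Vuforia returns each pose as a 3×4 row-major matrix, while OpenGL (and GearVRf transforms) expect a column-major 4×4 matrix; the Vuforia SDK provides Tool.convertPose2GLMatrix for this. As a self-contained illustration of what that conversion does (PoseUtil is my own hypothetical helper, not part of either SDK):

```java
public class PoseUtil {
    // Convert a 3x4 row-major pose matrix (rotation | translation), as
    // delivered by the tracker, into a column-major 4x4 OpenGL matrix:
    // transpose the 3x3 rotation block and move the translation into
    // elements 12..14 of the result.
    public static float[] pose2GLMatrix(float[] pose) {
        float[] gl = new float[16]; // initialized to zeros
        for (int row = 0; row < 3; row++) {
            for (int col = 0; col < 3; col++) {
                gl[col * 4 + row] = pose[row * 4 + col]; // transpose rotation
            }
            gl[12 + row] = pose[row * 4 + 3]; // translation -> last column
        }
        gl[15] = 1.0f;
        return gl;
    }
}
```

The resulting array is what gets handed to setTrackedPose along with the scale of the target.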


Rendering the camera see-through.

In order to render the camera frame we need to set up a render texture (passThroughTexture), which is an OpenGL ES framebuffer object linked to an OpenGL texture. The passThroughTexture id is then passed to the Vuforia Renderer object, which updates the texture with the contents of the camera frame.

To render the camera frame in the scene, a rectangular scene object (passThroughObject) is added to the scene, and the passThroughTexture is set as its material texture.


Creating the passThroughTexture:

passThroughTexture = new GVRRenderTexture(gvrContext,
        CAMERA_WIDTH, CAMERA_HEIGHT); // placeholder constants for the camera texture size

Creating the see-through rectangle:

GVRSceneObject passThroughObject = new GVRSceneObject(gvrContext, 16.0f / 9.0f, 1.0f);
GVRMaterial material = passThroughObject.getRenderData().getMaterial();
material.setMainTexture(passThroughTexture); // set the passThroughTexture

Assign the passThroughTexture to Vuforia renderer:

TextureData data = new GLTextureData(passThroughTexture.getId());
Renderer.getInstance().setVideoBackgroundTexture(data); // hand the texture to the Vuforia renderer


Here we invoke the Vuforia Renderer object to update the passThroughTexture. The onStep method is called once per frame by the GearVR framework:

public void onStep() {
    if (VuforiaSampleActivity.isVuforiaActive()) {
        Renderer.getInstance().updateVideoBackgroundTexture(); // update passThroughTexture
    }
}




Running the demo.


To run the demo, just build the Android Studio project. Don’t forget to put your Oculus osig file in the assets folder so you can test the sideloaded app on your device.

The tracked images are located in the images folder.

Code for this project in github:



iaco79 blog is starting…

Hi everybody…

This is the very first entry in this blog. My intention is to provide guidance, and also to receive feedback, for those of you interested in developing mobile applications (mainly the cool stuff: sound, graphics, and games), and perhaps other non-game projects as well.

I’ll start by talking a little bit about myself:

I’m from Monterrey, Mexico, and got my Bachelor’s degree in Computer Science in 2004 from the “Universidad Autonoma de Nuevo Leon”. Throughout my professional career I have participated in several projects, mostly working with object-oriented languages such as C#, C++, Java, and JavaScript.

I grew up in the 80s and 90s, during the golden age of the console wars (Nintendo vs. Sega) and the emergence of the very first 3D consoles. Now, as a grownup, I still recall those good old times and those very good old games.

So that’s why I’m here: I’ve been trying to release a game of my own, and right now it has never been easier to do so. You have at your disposal Google Play, the Apple App Store, Steam, and OUYA, all waiting for you to publish your very own blockbuster app, and there are thousands of open source engines and resources on the net that can teach you how to develop a game from the ground up.

At this moment I’m in the final phase of my game project, to be released initially on the Android market. All the development was done with open source libraries under zlib and MIT licenses (free as in free).

In the following posts I’ll describe the basic building blocks that make up this project, so you can use it as a base for your own developments. I’m not going to describe how to build a complete integrated game engine, but rather show how to assemble different open source libraries (listed below) into a complete mobile application with the following features:

All of the above are C/C++ based and cross-platform, which means that with little effort you can have your game running on Android, iOS, and Windows. (For instance, I do all my development and testing on a Windows machine, and then build an Android APK to share with my friends so they can give me feedback.)

Also, my main goal is to work at the native level to learn as much as possible about how things work under the hood, so I’m avoiding complete game development packages such as Unity 3D.

Well, I think that’s it for now…
