Category Archives: Mobile Development

A simple IPTV viewer for iOS

Adding video streaming functionality to an app is becoming more and more important; health care (telemedicine), hospitality (marketing), and education (training) are just a few industries that have adopted real-time media to conduct their business.

There are several real-time protocols for video streaming, such as RTMP and HLS. Many popular “free” internet TV channels are delivered using protocols based on HLS through m3u8 playlists.
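To make this concrete, an HLS master playlist is just a small text file that lists the available stream variants; the player picks one based on bandwidth. The URLs below are placeholders, not real channels:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=640x360
http://example.com/low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2560000,RESOLUTION=1280x720
http://example.com/high/index.m3u8
```

Each variant URL points to a media playlist that in turn lists the actual video segments.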

In this project I created a sample app that shows how to play m3u8 playlists. The app is based on the ijkmedia framework, which in turn is built on FFmpeg, a set of open source libraries that can handle all sorts of media formats and protocols.

Video of the app.

 

The project consists of 3 main views:

  1. The Player view controller.
    • Contains the ijkmedia player.
    • Also provides controls for tagging a favorite and setting the thumbnail and description.
  2. A search panel.
    • Links are obtained by scraping a set of web pages.
  3. A favorites container.
    • Presents the tagged favorite m3u8 links.
    • Favorites can be re-arranged by dragging them.

iacoplayer2

The UI part also makes use of Core Animation and swipe gestures to activate the search, favorites and drawer views.

 

Main Components:

iacoplayer1

  • Video Manager.
    • Handles the user favorites and search requests.
    • The search results and favorites data are stored using Core Data.
    • Thumbnails are stored as PNGs in the app sandbox.
  • Web scraper.
    • Searches for m3u8 playlists in the provided web page link using regexp.
  • ijkmedia/ffmpeg.
    • Open source framework for handling the streaming and playback session.
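To make the web-scraper idea concrete, here is a minimal sketch of extracting m3u8 links from a page's HTML with a regular expression. The app itself is an iOS project, so this standalone Java snippet is only an illustration of the technique; the class name `M3u8Scraper` and the exact pattern are my own assumptions, not the app's actual code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class M3u8Scraper {
    // Matches absolute URLs ending in .m3u8, optionally with a query string.
    // Quotes and angle brackets are excluded so the match stops at HTML delimiters.
    private static final Pattern M3U8 =
            Pattern.compile("https?://[^\\s\"'<>]+\\.m3u8(\\?[^\\s\"'<>]*)?");

    // Returns every m3u8 link found in the given HTML source.
    public static List<String> extract(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = M3U8.matcher(html);
        while (m.find()) {
            links.add(m.group());
        }
        return links;
    }
}
```

A real scraper would also need to resolve relative URLs and deduplicate results, but the core of the search feature is this kind of pattern match over the fetched page.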

 

The app source code and build instructions are in the following github repository:

https://github.com/iaco79/IaCOPlayer

Implementing mixed reality with the Samsung Gear VR

The Gear VR as it stands now is one of the most affordable options for experiencing immersive virtual reality. If you own a compatible device, then the Gear VR is a must-have.

In this post I describe a mixed reality Android project that I created to start learning AR/VR development for the Gear VR.

Video of the demo running in development mode.

 

Mixed reality combines AR and VR. In this project, the AR part consists of displaying animated 3D models on top of tracked images; the VR part consists of projecting the see-through camera feed plus the AR imagery, so we can see the outside world while wearing the Gear VR headset.

AR_IMG1

Project overview.

The following image shows the VR Scene setup.

AR_IMG3

In this project I’m working with the GearVR Framework (GearVRf), an open source framework for creating apps for the Samsung Gear VR headset. I highly recommend having a look at the sample apps; most questions about how to implement certain functionality with GearVRf are covered in the sample code.

 

I took the vuforia-sample as the base for this project and then added the animated 3D models using .dae files that I exported with Blender. As you can tell, the demo makes use of the Vuforia mobile SDK (vuforia-sdk-android-5-5-9) to implement the markerless AR functionality.

There are several options for implementing AR on mobile; for this project I wanted to track images and then display a 3D model on top of each tracked image. Vuforia provides out-of-the-box functionality to tag and detect images and to track the generated homography; it also provides a free license for personal projects.

OpenCV is also an option, but perhaps for another project, since it would take more effort to implement a bag-of-words model, a multi-class classification algorithm, and so on (something I’ve been learning from several computer vision courses on Coursera).

Required tools to build the project:

In order to build the source code you will need:

 

Loading the 3d models.

I got two animated 3ds Max models from TurboSquid. Before loading them with the GearVR framework, I first had to import them into Blender to align their orientations and then export them to COLLADA format (.dae).

AR_IMG4

GearVRf can load models in several 3D formats; internally it wraps the Assimp library, which makes loading 3D models and animations really easy.

Loading the t-rex model:


GVRModelSceneObject rexModel = gvrContext.loadModel("Tyrannosaurus.dae");

 Starting the animation:


List<GVRAnimation> animations = mModel.getAnimations();
if (animations.size() >= 1) {
    mAnimation = animations.get(0); // get the first animation
    mAnimation.setRepeatMode(GVRRepeatMode.REPEATED).setRepeatCount(-1); // loop the animation
    mAnimation.start(mAnimationEngine); // start the animation
}

 

Tracking images with Vuforia.

 

The Vuforia Developer Portal allows you to upload the images you want to track; the online tool generates a database, a set of files containing the images’ feature descriptors. The Vuforia mobile SDK then uses the database to detect and classify the tracked images.

AR_IMG5

The generated database files are located in the App assets folder.

AR_IMG6

Each 3D model must be associated with the corresponding TrackedImage in the Vuforia database. For this I created the TrackedModel class, which holds the 3D model and the image id; on each frame it updates the 3D model transform using the transformation matrix returned by the Vuforia SDK during tracking.


public class TrackedModel extends GVRSceneObject {
    ...
    private GVRModelSceneObject mModel; // the animated 3D model
    private int mTrackingId;            // the TrackedImage id

    // sets the vuforia transform matrix and scale
    public void setTrackedPose(float[] matrix, float scale);
    ...
    // updates the model transform using the most recent vuforia transform and scale
    private void updateTrackedPose();
}
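The body of updateTrackedPose() is not shown above. As a rough sketch of the kind of math involved, the hypothetical helper below bakes a uniform scale into a column-major 4x4 pose matrix of the shape Vuforia returns; the class and method names are my own illustration, not part of the GearVRf or Vuforia API:

```java
public class PoseUtil {
    // Bakes a uniform scale into a column-major 4x4 pose matrix by scaling
    // the three rotation columns; the translation (column 3) is left untouched.
    public static float[] scalePose(float[] pose, float scale) {
        float[] out = pose.clone();
        for (int col = 0; col < 3; col++) {
            for (int row = 0; row < 3; row++) {
                out[col * 4 + row] *= scale;
            }
        }
        return out;
    }
}
```

The resulting matrix can then be applied to the scene object's transform so the model sits on the tracked image at the desired size.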

AR_IMG7

Rendering the camera see-through.

In order to render the camera frame, we need to set up a RenderTexture (passThroughTexture), which is an OpenGL ES framebuffer object linked to an OpenGL texture. The passThroughTexture id is then passed to the Vuforia Renderer object to update the texture with the contents of the camera frame.

To render the camera frame in the scene, a rectangular SceneObject (passThroughObject) is added to the scene and passThroughTexture is set as its material texture.

AR_IMG8

Creating the passThroughTexture:

passThroughTexture = new GVRRenderTexture(gvrContext,
 VUFORIA_CAMERA_WIDTH, VUFORIA_CAMERA_HEIGHT);

 

Creating the see-through rectangle:

GVRSceneObject passThroughObject = new GVRSceneObject(gvrContext, 16.0f / 9.0f, 1.0f);
…
…
material.setMainTexture(passThroughTexture); //set the passThroughTexture

Assign the passThroughTexture to Vuforia renderer:

TextureData data = new GLTextureData( passThroughTexture.getId());

Renderer.getInstance().setVideoBackgroundTexture(data);

Here we invoke the Vuforia Renderer object to update the passThroughTexture. The onStep method is called once per frame by the GearVR framework:

@Override
public void onStep() {
    …
    if (VuforiaSampleActivity.isVuforiaActive()) {
        Renderer.getInstance().begin();
        Renderer.getInstance().updateVideoBackgroundTexture(); // update passThroughTexture
        Renderer.getInstance().end();
    }
}

 

Running the demo.

AR_IMG2

To run the demo, just build the Android Studio project; don’t forget to put your Oculus osig file in the assets folder so you can test the sideloaded app on your device.

The tracked images are located in the images folder.

Code for this project in github:

Links:
