Author: Pascal


Last week Google launched a new app named Cardboard Camera.
The app enables you to capture a 360-degree panoramic image of your surroundings and then experience it in 3D with any Google Cardboard-compatible headset.
It even records an audio clip while you're taking the panoramic shot, so make sure you turn the sound on when you admire the result.

The generated file is a .vr.jpg. When opened with a text editor, you can clearly see that the file contains Base64-encoded data: a block corresponding to the image generated for the right eye, and a GAudio:Data block representing the recorded sound.
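The extraction itself boils down to locating those Base64 attributes in the file's XMP metadata and decoding them back to binary. As a rough illustration (the helper name below is my own, and a complete script would use the Python XMP Toolkit to read the metadata properly rather than regex-scanning raw text), pulling one encoded block out could look like this:

```python
import base64
import re

def extract_base64_property(xmp_text, prop):
    """Find an XMP attribute like GAudio:Data="..." in raw XMP text and
    return its Base64-decoded bytes, or None if the property is absent."""
    match = re.search(re.escape(prop) + r'="([^"]+)"', xmp_text)
    if match is None:
        return None
    return base64.b64decode(match.group(1))

# Hypothetical usage on the raw contents of a .vr.jpg:
# audio = extract_base64_property(raw_text, 'GAudio:Data')
# open('audio.mp4', 'wb').write(audio)
```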
On OS X, in order to extract all the information from the .vr.jpg file, you will need Python, the Python XMP Toolkit, Homebrew and Exempi.
To install the Python XMP Toolkit, download it here.
In your terminal, type:
user$ sudo python setup.py install
To install Homebrew, just use curl to download and run the install script in the terminal:
user$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Once this is done, you will need to install Exempi using Homebrew:
user$ brew install exempi

Finally, you can use this Python script to extract all the necessary files.
To run the script, simply unzip the file and type in your terminal:
./  IMG_FILE.vr.jpg

Original Image (Left Eye)

Transformed Image (Right Eye)

Audio File

Images as seen on the phone using the Cardboard Camera app

Trying Out the Headset


Screenshot from the Cardboard Camera application (cross your eyes and bring the images together to see it in 3D)


Hello everyone! As some of you may know, I've had a passion for insects from a very young age, and it shows in the kinds of photographs I like to take! From the first time I got my hands on a digital camera, I've been a big fan of macro photography. Back in 2003, I had a good old Canon Powershot A70 and was able to take some pretty good shots with it.

Then I took some pictures with a Canon Powershot A640.

I decided it was time to upgrade and got myself a Canon Rebel T2i in 2010. Every year I was buying new things for it like macro extension tubes, a Bower Macro Ring Flash and more recently a Raynox DCR-250.

I’m using the 31mm and 13mm extension tubes on a Canon EF-S 55-250mm lens.

The Raynox DCR-250 gives me 2.5 times more magnification, so I'm able to fill the entire frame with just the insect.

On top of that, I've added the Bower Macro Ring Flash, which enables me to take pictures with good depth of field between f/25 and f/32, crucial when dealing with such a thin plane of focus.

The only drawback of the macro ring flash I'm using right now is that it overexposes some parts of the insect: you can clearly see the ring of light reflected off them. To fix this issue, I would need to get a diffuser.

Here's the result of some super macro pictures I took this weekend with this setup:



This is the Mini ELM327 V1.5 OBD2 II Bluetooth Diagnostic Car Auto Interface Scanner HE.
I tested it and it works very well.
It's useful for diagnosing check engine light problems and for reading a lot of information from your car's sensors.
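For the curious, talking to an ELM327 adapter is just a serial conversation: you send an OBD-II PID as text (for example, 010C for engine RPM) and decode the hex reply. Here's a small sketch of the decoding step, following the standard OBD-II formula for PID 0C; the serial plumbing over the Bluetooth port is omitted, and the function name is my own:

```python
def parse_rpm(response):
    """Decode an ELM327 reply to OBD-II PID 010C (engine RPM).

    A reply looks like '41 0C 1A F8': '41 0C' echoes the request, and
    the next two bytes A and B give rpm = ((A * 256) + B) / 4.
    """
    parts = response.strip().split()
    if parts[:2] != ['41', '0C']:
        raise ValueError('unexpected response: %s' % response)
    a, b = int(parts[2], 16), int(parts[3], 16)
    return (a * 256 + b) / 4.0
```

So a reply of '41 0C 1A F8' works out to (26 * 256 + 248) / 4 = 1726 RPM.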


My final project at Concordia University is over; it has been a great experience working with an interdisciplinary team of software, mechanical, electrical, computer and industrial engineering students! I was responsible for programming the embedded computers on board the rover. The embedded programming was mostly done in C++ and C#, and the operator user interface was done in .NET / MVVM. An Arduino Mega was used on the robot to interface with the multiple sensors and servo motors, along with a Pico-ITX board in charge of communicating with the operator's main computer and streaming the cameras.
Here’s an article talking about it on Concordia University’s website:
Saved PDF in case the link becomes broken: Concordia Mars Rover 2014 Captstone_Project Article


I want to announce that in the upcoming week I will review the new SAMA5D3 Xplained board and try to interface a camera with it in order to give vision to my robots, something I usually can't do with an Arduino UNO or MEGA alone.
I will surely use the OpenCV library on it and try to interface a robot platform with it.

For more information on the Atmel SAMA5D3 Xplained Evaluation Kit, make sure to visit these links:

Atmel SAMA5D3 Xplained evaluation kit specifications
Atmel SAMA5D3 Xplained Hardware



I'm currently participating in the University of Utah Mars Rover competition. I made a small prototype of what an arm or stabilization system could look like on the robot.
Materials: three 9G micro servos, one 9-DOF IMU, and an Arduino Uno with a Sensor Shield V5.0 mounted on top.

Here's a stabilizer mode instead of an arm-mimicking mode, useful for keeping objects level while they are mounted on a moving vehicle going over bumps and obstacles.
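The stabilizer logic itself is simple counter-rotation: whatever tilt the IMU measures on the chassis, the servo is commanded by the same angle in the opposite direction, clamped to its travel. The actual Arduino sketch is written in C/C++, but the core idea fits in a few lines of Python (the function and parameter names here are illustrative, not from my sketch):

```python
def stabilize(neutral_deg, measured_tilt_deg):
    """Counter-rotate a servo against the chassis tilt reported by the IMU.

    If the vehicle pitches +10 degrees, the servo is driven 10 degrees
    the other way so the mounted object stays level. The command is
    clamped to a typical 0-180 degree servo range.
    """
    command = neutral_deg - measured_tilt_deg
    return max(0.0, min(180.0, command))
```

On the Arduino, the same computation would run in loop(), feeding the result to Servo.write() each time a new IMU reading comes in.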