Last week Google launched a new app named Cardboard Camera.
The app enables you to capture a 360-degree panoramic image of your surroundings and then experience it in 3D with any Google Cardboard compatible headset.
It even records an audio clip while you’re taking the panoramic shot, so make sure you turn the sound on when you admire the result.
The generated file is a .vr.jpg; when opened with a text editor, you can clearly see that the file is made up of Base64-encoded data: one block corresponding to the generated right-eye image and a GAudio:Data block representing the recorded sound.
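As a rough illustration of the decoding step, here is a stdlib-only Python sketch. The GImage:Data property name follows Google's published photo-sphere XMP conventions, and the simple regex is an assumption on my part — a real .vr.jpg wraps these attributes in a full XMP packet, which is exactly what the toolkit below handles properly:

```python
import base64
import re

def extract_embedded(xmp_text):
    """Pull Base64 payloads out of a .vr.jpg's XMP metadata.

    Looks for GImage:Data (right-eye image) and GAudio:Data (recorded
    sound) attributes and returns the decoded bytes for each one found.
    """
    found = {}
    for prop in ("GImage:Data", "GAudio:Data"):
        match = re.search(prop + r'="([^"]+)"', xmp_text)
        if match:
            found[prop] = base64.b64decode(match.group(1))
    return found

# Tiny synthetic fragment standing in for a real file's metadata.
sample = 'GImage:Data="%s" GAudio:Data="%s"' % (
    base64.b64encode(b"right-eye image bytes").decode(),
    base64.b64encode(b"recorded audio bytes").decode(),
)
print(extract_embedded(sample))
```

In practice you would hand the real file to the Python XMP Toolkit rather than regex it by hand, which is why the setup steps below matter.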
On OS X, in order to extract all the information from the .vr.jpg file, you will need Python, the Python XMP Toolkit, Homebrew and Exempi.
To install the Python XMP Toolkit, download it here.
In your terminal, type:
user$ sudo python setup.py install
To install Homebrew just use curl to download and install the script in the terminal:
user$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Once this is done you will need to install Exempi using Homebrew:
user$ brew install exempi
Finally, you can use this Python script to extract all the necessary files:
To run the script, simply unzip the file and type in your terminal:
Hello everyone! As some of you may know, I’ve had a passion for insects from a very young age, and it shows in the kind of photographs I like to take. From the first time I got my hands on a digital camera, I’ve been a big fan of macro photography. Back in 2003, I had a good old Canon PowerShot A70 and was able to take some pretty good shots with it.
Then I took some pictures with a Canon PowerShot A640.
I’m using the 31mm and 13mm extension tubes on a Canon EF-S 55-250mm lens.
The Raynox DCR-250 gives me about 2.5× more magnification, so I’m able to fill the entire frame with just the insect.
On top of that, I’ve added the Bower Macro Ring Flash, which lets me shoot with good depth of field between f/25 and f/32 — crucial when dealing with such a thin plane of focus.
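To see why such small apertures matter at this scale, here is a quick back-of-the-envelope sketch using the standard close-up depth-of-field approximation DoF = 2·N·c·(m+1)/m². The 2.5× magnification and the APS-C circle of confusion (about 0.019 mm) are assumed values for illustration, not measurements:

```python
def macro_dof_mm(f_number, magnification, coc_mm=0.019):
    """Close-up depth-of-field approximation: DoF = 2*N*c*(m+1)/m^2.

    coc_mm is the circle of confusion, roughly 0.019 mm for an
    APS-C sensor like the one behind an EF-S lens.
    """
    return 2 * f_number * coc_mm * (magnification + 1) / magnification ** 2

m = 2.5  # rough magnification with the tubes + Raynox stacked
for n in (8, 25, 32):
    print("f/%d -> %.2f mm of sharp depth" % (n, macro_dof_mm(n, m)))
```

Even at f/32 the sharp zone is well under a millimetre, which is why the flash power to support those apertures is so valuable.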
The only drawback of the macro ring flash I’m using right now is that it overexposes some parts of the insect: you can clearly see the ring of light reflected right off the shiny surfaces. To fix this issue, I would need to get a diffuser.
Here’s the result of some super macro pictures I took this weekend with this setup:
This is the Mini ELM327 V1.5 OBD2 II Bluetooth Diagnostic Car Auto Interface Scanner HE.
I tested it and it works very well.
It’s useful for diagnosing your check engine light problems and getting a lot of information from your car’s sensors.
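Under the hood, the adapter just speaks plain serial AT commands and echoes back hex bytes, so the decoding side is simple. As a sketch (actually talking to the adapter over Bluetooth would need a serial library such as pyserial, which I’ve left out here), this is how two standard SAE J1979 Mode 01 replies decode:

```python
def parse_obd_reply(reply):
    """Decode a couple of standard Mode 01 replies from an ELM327.

    The adapter returns hex bytes as text, e.g. '41 0C 1A F8' is a
    reply to PID 0C (engine RPM); per SAE J1979, RPM = ((A*256)+B)/4.
    """
    b = [int(x, 16) for x in reply.split()]
    if b[0] != 0x41:                 # 0x41 = positive Mode 01 response
        raise ValueError("not a Mode 01 reply")
    pid, data = b[1], b[2:]
    if pid == 0x0C:                  # engine RPM
        return ("rpm", (data[0] * 256 + data[1]) / 4)
    if pid == 0x05:                  # coolant temperature, offset by -40
        return ("coolant_c", data[0] - 40)
    raise ValueError("PID not handled in this sketch")

print(parse_obd_reply("41 0C 1A F8"))  # ('rpm', 1726.0)
print(parse_obd_reply("41 05 7B"))     # ('coolant_c', 83)
```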
My final project at Concordia University is over; it has been a great experience working with an interdisciplinary team of software, mechanical, electrical, computer and industrial engineering students! I was responsible for programming the embedded computers on board the rover. The embedded programming was mostly done in C++ and C#, and the operator user interface was done in .NET / MVVM. An Arduino Mega was used on the robot to interface with the multiple sensors and servo motors, along with a Pico-ITX board in charge of communicating with the operator’s main computer and streaming the cameras.
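The project’s actual link protocol isn’t shown here, but to give an idea of what passing commands between the operator side and the Arduino involves, here is a minimal checksummed command frame. The byte layout (start byte, servo id, angle, XOR checksum) is purely hypothetical, chosen for illustration:

```python
def frame_command(servo_id, angle):
    """Build a hypothetical operator->rover servo command frame:
    one start byte, a two-byte payload, and an XOR checksum."""
    payload = bytes([servo_id, angle])
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([0xAA]) + payload + bytes([checksum])

def parse_command(frame):
    """Validate and unpack a frame built by frame_command."""
    if frame[0] != 0xAA:
        raise ValueError("bad start byte")
    servo_id, angle, checksum = frame[1], frame[2], frame[3]
    if servo_id ^ angle != checksum:
        raise ValueError("checksum mismatch")
    return servo_id, angle

print(parse_command(frame_command(3, 90)))  # (3, 90)
```

The checksum lets the Arduino drop frames corrupted on a noisy serial link instead of driving a servo to a garbage angle.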
Here’s an article talking about it on Concordia University’s website: http://www.concordia.ca/cunews/main/stories/2014/04/01/running-on-ice-snowandmars.html
I want to announce that in the upcoming week I will review the new SAMA5D3 Xplained board and try to interface a camera with it in order to give vision to my robots, something I usually can’t do with the Arduino UNO or MEGA alone.
I will surely use the OpenCV library on it and try to interface a robot platform with it.
For more information on the Atmel SAMA5D3 Xplained Evaluation Kit, make sure to visit these links:
Last week I visited the Arcade 11 & TAG Open House at Concordia University. There were many different experiments involving games, and we tried out most of them. It was a fun experience, and I hope there will be a Montreal Joue festival next year! Here are some pictures and a video of the event:
I’m currently participating in the University of Utah Mars Rover competition. I made a small prototype of what an arm or stabilization system could look like on the robot.
Material: three 9g micro servos, one 9-DOF IMU, and an Arduino Uno with a Sensor Shield V5.0 mounted on top.
Here’s the stabilizer mode (as opposed to the arm-mimicking mode), useful for keeping objects steady while they are mounted on a moving vehicle encountering bumps and obstacles.
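The Arduino sketch itself isn’t posted here, but the heart of the stabilizer mode is just counter-rotating each servo by the tilt the IMU measures. A minimal one-axis Python sketch of the idea, assuming the IMU already provides a filtered pitch angle in degrees:

```python
def stabilize(servo_neutral_deg, imu_pitch_deg):
    """Counter-rotate a servo by the measured tilt so the mounted
    object stays level: command = neutral - tilt, clamped to the
    servo's 0-180 degree range."""
    cmd = servo_neutral_deg - imu_pitch_deg
    return max(0, min(180, cmd))

# Vehicle pitches 15 degrees up over a bump: the servo swings 15 back.
print(stabilize(90, 15))   # 75
print(stabilize(90, -20))  # 110
print(stabilize(90, 100))  # 0 (clamped at the servo's limit)
```

With three servos and the three IMU axes, the same correction is applied per axis; the clamp matters because a 9g servo simply can’t follow tilts beyond its travel.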