Final Week Final Blog!!

Hey everyone, as the project submission deadline nears, I have been working to finish my project. The Heads-Up Display (HUD) is rendering perfectly fine on the Oculus, interacting with the Leap Motion and receiving data from the server. So far, I have been able to integrate the HUD with my own demo model in Blender and simulate it on the Oculus. The final step involved integrating the HUD with one of the model scenes available in the IMS V-ERAS repository.

This was a difficult task, as the Italian Mars Society initially started simulating models through the Blender Game Engine for the DK1, but since Oculus support for Blender was very limited, they decided to shift to Unity instead. Currently, they are working only in Unity, and as I am working under the PSF organization, using Unity was not an option. Also, since a lot has changed from the Rift DK1 to the DK2, most of the models failed to render successfully on the Oculus.

I had a little chat with my mentor about this issue, and he asked me to integrate the HUD with the models that rendered fully on the DK2 and forget about the others. After that, I found an avatar model and began work to integrate the HUD onto it.

A couple of days later, I was able to render the HUD on the Oculus with one of the V-ERAS models through the Blender Game Engine, and the result looks good.

[Image: HUD]

Currently, I am trying to make the HUD an addon in Blender Game Engine so that it can be imported into any model/scene and render successfully on the Oculus.

As for the final submission, I have started on the documentation, and hopefully I will submit by the end of this week.

Final Post 😦

Work Continues!

Hey all, as the deadline for the final submission nears, I have been working to finalize my project and fix the few bugs that keep showing up and annoying me. The project is about 80% done, according to me :p. I am now able to render the HUD dynamically on the Oculus with my own demo model made in the Blender Game Engine. The only work left is to integrate my HUD with the V-ERAS models for Blender and render the final scene on the Oculus DK2. I have been going through the models available in the IMS V-ERAS repository to find one suitable to render the HUD on.

My college has also started, so these final days will be very hectic. Hoping for the best!

Oculus working yaay

Hey everyone, sorry for this late post.

I was busy setting up the Oculus. It has been a pain, but in the end a sweet one :p. A week ago, I was down, thinking of even quitting the program. I had my code ready to run, but it just wouldn’t show up on the Oculus. I was lost, but somewhere inside I knew I could do it. So I got up one last time, sat through the day tweaking my code, tweaking the Blender Game Engine, changing the configuration for the Oculus, and at last: Bazinga!

Thank God, I said to myself, as my code was finally running on the Oculus :p.

Here is a link to the DEMO VIDEO GSOC

Oculus A Guide – Week 2

The next week started with a Skype call from my mentors. I told them my problems, and they understood very well. They gave me hope and told me to try something else, something not in their documentation for the Oculus setup on Blender. They were very understanding and said that if I wanted to switch to Unity, we could look into that as well.

That call gave me the motivation to go for it one more time, to try again. And as I write this post, I have hope. I found a solution online, talked about it with my mentor, and it was agreed upon.

Although I have lost a lot of time in setting up, I can now look forward.

Thanks

Oculus A Guide – Week 1

Midterms are over, and I passed with a good evaluation from my mentors. Now it was time to move on to the second part of the project, the main part, i.e. the Oculus Rift DK2. I had some trouble setting up the Oculus during the community bonding period and thought I would resolve those issues during the coding period after midterms. But heck, I was wrong. The issues were mainly related to Oculus support for Linux and Blender. I had to use Linux because the organisation’s environment was set up on it, and Blender for the same reason.

There were docs provided by my mentor from the Italian Mars Society, but they were insufficient, as they were meant for the Rift DK1 and I had a DK2. That shouldn’t be such a problem, right? Wrong, completely wrong. I spent days trying to figure out how to set up the Oculus for my project. My mentors were trying to help, but they weren’t sure either, as they hadn’t used the DK2 on Ubuntu with Blender. I kept trying for a week and was stuck, literally stuck. Frankly, I had lost hope and was very disappointed.

Midterms are here!

Hey all, midterms are coming this week, and the first part of my proposal is about done. Some documentation work is all that’s remaining.

Well, my first part involved integrating the Leap Python API and the Blender Python API. Initially it was tough, as the Leap API is designed for Python 2.7, whereas Blender only supports Python 3.5. Since I wasn’t able to integrate them directly, I came up with another solution: send data from the Python 2.7 script to the Blender Python API over a socket connection. This worked well, but it was somewhat inefficient and slow.

Hence I thought I would try to generate a Python 3.5 wrapper for the Leap SDK. This sounded easy but was anything but. I found little support online; there were pages meant to help, but most of them were for Windows and very few for Ubuntu. However, after days of trying, I found a solution with a little help from the community and was able to generate a Python 3.5 wrapper for the Leap SDK through SWIG. The wrapper worked perfectly, and I successfully integrated the two APIs.

I talked to my mentor about the gesture support required for the project and added more gestures.

I am now documenting my code and preparing myself for the second part of my project.

Leap Motion Working

After wasting days trying to generate a Python 3.5 wrapper for the Leap API, I decided to make a socket connection between the Python 2.7 script that gets tracking data from the Leap Motion and the Blender Game Engine. So I started work on the script to recognize the gestures, and I was able to successfully send the Leap Motion Controller’s tracking data to the BGE through the socket connection.
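The bridge described above can be sketched in miniature with plain sockets. This is a minimal, self-contained sketch rather than the actual project code: the port, the newline-delimited JSON framing, and the tracking values are my own hypothetical stand-ins. In the real setup, the sender read frames from the Leap SDK under Python 2.7 and the receiver ran inside the BGE.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 9003  # assumption: any free local port works


def bge_receiver(server_sock, frames_out):
    """Stand-in for the script running inside the Blender Game Engine:
    accepts one connection and collects newline-delimited JSON frames."""
    conn, _ = server_sock.accept()
    buf = b""
    with conn:
        while True:
            chunk = conn.recv(4096)
            if not chunk:
                break
            buf += chunk
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                frames_out.append(json.loads(line.decode("utf-8")))


def send_tracking_frames(frames):
    """Stand-in for the Python 2.7 Leap script: serialises each tracking
    frame as one JSON line and streams it over TCP."""
    with socket.create_connection((HOST, PORT)) as sock:
        for frame in frames:
            sock.sendall(json.dumps(frame).encode("utf-8") + b"\n")


server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)

received = []
t = threading.Thread(target=bge_receiver, args=(server, received))
t.start()

# Hypothetical tracking data; the real script would read it from the Leap SDK.
send_tracking_frames([
    {"gesture": "swipe", "palm": [12.5, 200.0, -30.1]},
    {"gesture": "circle", "palm": [15.0, 198.2, -28.7]},
])
t.join()
server.close()
print(received[0]["gesture"])  # prints: swipe
```

Newline framing is the design choice worth noting: TCP is a byte stream with no message boundaries, so the receiver splits on `\n` to recover one JSON frame at a time.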

Next, I had to make sure that my Habitat Monitoring Server worked as well, so I configured my Blender client to get the data from the Leap Motion and use it to fetch data from the HMC server.

Woohoo, it worked!! I made a small demo video on YouTube and pushed my code to GitHub!!

The Leap Motion Conundrum

Well, the first part of my proposal involved integrating the Leap Motion Controller and the Blender Game Engine (BGE). I thought this wouldn’t be so hard, as Leap Motion has a Python API and so does Blender. So my task was simple: integrate the Blender Python API (bpy) with the Leap Python API. Right?

No way. As I started work on the integration, I ran into a small problem (large, as it seems now). It turns out the official Leap Python API supports only Python 2.7, and bpy runs on Python 3.5. Well, that was a concern. I was able to obtain the Leap data from a Python 2.7 script quite easily; all gestures were detected fine, and I thought all I had to do was port the 2.7 script to 3.5 in Blender. But no, it wasn’t that easy. After porting the script and running it in Blender, I came across all sorts of errors related to the Leap API.

It just wasn’t working.

I searched the Leap community for answers and thankfully found one about making a Python 3.3 wrapper for the Leap API using SWIG. So I looked up the solution, and it turns out they had no exact steps for generating a wrapper on Ubuntu. Well, technically they had a link to another page for Ubuntu, but it wouldn’t open at all. For Windows and macOS it was all there. Unfortunately, not for Ubuntu 😦

So I kept searching for a solution, and whenever I thought I had found one, I ran into the same problem: they all linked to the same web page, which wouldn’t open at all.

So I decided I would try to generate the wrapper myself, learning about SWIG and all. Well, I tried and tried and tried, yet couldn’t fix the problem. One evening, I even thought my project was dead.

But then another solution came to mind: a socket connection between Python 2.7 and bpy. I was able to get all the Leap Motion tracking data through my 2.7 script, so I would send it to the BGE through a socket connection. I consulted my mentors, they said it sounded good, and now I am up and running again.

Finally Coding and Getting results!

After a bad (expected) start, I was able to achieve some part of my project. I studied the existing Habitat Monitoring Client (HMC) server, as I had to modify it for my project. I played around with it and tried to get the necessary data. The HMC server is GUI-based, and I had to use it without the GUI. Basically, my task was to remove the GUI from the server and modify it so the data could be sent to the Blender client.

I began coding the server and had some initial success. I was in touch with my mentor, and after 3 days the server was working completely without the GUI. It was working fine, as required for my project, and I contacted my mentor to tell him about it. He asked me to push my code to my repository on Bitbucket.

Finally, some coding and pushing! I felt happy as I pushed my first piece of code (technically not mine), but still, I pushed it. It was there in the repo, and now it was time to start working on the Blender client and make a socket connection between the client and server.

I studied socket programming in C at my college but never really tried it with Python. I read up on it a bit and found it was much the same except for some syntax. But my task was a bit harder, as my client wasn’t a normal TCP Python client; it was a Blender client (bpy). So I had to study a bit of Blender scripting through its Python API, bpy. I was very pleased, as I got to learn two new things, and I guess this is what GSoC is about.

I learned, studied, coded, and finally was able to establish a TCP socket connection between the Blender client and the HMC server. I pushed the code to my repo and felt satisfied.
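The client–server exchange above can be sketched as a tiny request/response protocol. Everything here is a hypothetical stand-in, not the actual HMC code: the port, the JSON message shape, and the telemetry values are my assumptions, and the real client ran inside Blender via bpy rather than as a plain script.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 9010  # assumption: hypothetical local test port

# Hypothetical telemetry; the real HMC server reads live habitat data.
TELEMETRY = {"temperature": 21.4, "pressure": 101.3, "humidity": 45.0}


def hmc_server(server_sock):
    """Headless stand-in for the HMC server: answers one JSON query
    per connection with the requested telemetry value."""
    conn, _ = server_sock.accept()
    with conn:
        request = json.loads(conn.recv(1024).decode("utf-8"))
        reply = {"param": request.get("param"),
                 "value": TELEMETRY.get(request.get("param"))}
        conn.sendall(json.dumps(reply).encode("utf-8"))


def blender_client_query(param):
    """Stand-in for the bpy client: requests one parameter over TCP."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(json.dumps({"param": param}).encode("utf-8"))
        return json.loads(sock.recv(1024).decode("utf-8"))


server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen(1)
t = threading.Thread(target=hmc_server, args=(server,))
t.start()

reply = blender_client_query("temperature")
t.join()
server.close()
print(reply["value"])  # prints: 21.4
```

A request/response shape like this keeps the Blender client simple: it only asks for the values it needs, when it needs them, instead of parsing a continuous stream.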

1 Week down!

Hello everyone, it’s been over a week since the coding period began, and what a week it has been. It started off slowly; I was overwhelmed, nervous, and at the same time very excited to begin my project. The Oculus Rift had been giving me problems every time I ran a Blender file on it (it even crashed my Ubuntu 15.10 system), so I decided to switch to 14.04 LTS and give it a shot. Well, that was a very bad decision. As it turns out, my laptop didn’t take it well: 14.04 couldn’t configure my Wi-Fi and, most importantly, the Nvidia GPU drivers, and it gave all sorts of errors when I installed those drivers manually. In short, I wasted a day and a half just setting up. I know the community bonding period was for setting up the coding environment, but heck, I wasn’t sure such errors would pop up. However, I went back to 15.10, and my laptop is running smoothly now.

For the Oculus part, I called my mentor and asked if I could do the Python server part of my proposal now and move on to the Oculus later. He agreed with my decision, as he said they were working on the Oculus and Blender integration at their end and would let me know once it was done.

So after wasting almost 3 days, my confidence was a bit low and I was completely out of my depth. My mentor said this happens during the initial period and told me to focus on the other part of the project. I realised that Python was my strength and decided to focus wholly on the server and get it running as soon as possible.

Every day is a fresh new start, and I hope to conquer it. 🙂