The goal of Virtualpy is to create a Python library that makes it easy to build comfortable VR experiences. VR requires a very high framerate, meaning the positions of objects in the scene must be updated very frequently. Python code, however, especially when running a more complex simulation, can easily fail to keep up with the 75 fps update rate needed to animate at that frequency. By introducing approximately one frame of latency between when a frame is described through the Python library and when it is shown on screen, it is possible to interpolate between the frame last shown and the frame to be shown next in order to create intermediate frames. This lets the underlying rendering code maintain head tracking and animation at the full framerate even while the Python code that describes the frames runs much more slowly.
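To make the interpolation idea concrete, here is an illustrative sketch (not virtualpy's actual implementation) of how intermediate frames can be derived from two described frames: positions are linearly interpolated, and orientations, stored as quaternions, are blended with normalized linear interpolation.

```python
# Illustrative sketch only: interpolating object state between the last
# Python-described frame and the next one, so the renderer can emit
# intermediate frames at the full display rate.

def lerp(a, b, t):
    """Linear interpolation between two equal-length vectors."""
    return [x + (y - x) * t for x, y in zip(a, b)]

def nlerp(q0, q1, t):
    """Normalized linear interpolation between two quaternions (w, x, y, z).
    Cheaper than full slerp and usually adequate for small per-frame steps."""
    # Flip sign if needed so we interpolate along the shorter arc.
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:
        q1 = [-c for c in q1]
    q = [a + (b - a) * t for a, b in zip(q0, q1)]
    norm = sum(c * c for c in q) ** 0.5
    return [c / norm for c in q]

# If Python describes a new frame every 2 seconds while the display runs at
# 75 fps, each pair of described frames yields ~150 interpolated frames.
prev_pos, next_pos = [0.0, 0.0, 0.0], [1.0, 2.0, 0.0]
halfway = lerp(prev_pos, next_pos, 0.5)
```

Head orientation itself would not be interpolated this way; the point is that scene animation stays smooth even though the simulation updates rarely.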
Team: Matthew Hoffman (mfhoffm2) and Victor Ge (vge2)
The Python library itself is a C extension, written in C++ and compiled into a .dll file that Python recognizes as a module. Details about extending Python with C/C++ can be found here: https://docs.python.org/3.4/extending/extending.html. The C++ code uses DirectX as its rendering API and the official Oculus SDK for head tracking and Direct to Rift rendering.
The challenge for this component had two major parts. First, the rendering engine had to be written, and it needed to be adaptable enough to readily accept whatever configuration the Python code decided on, which meant fairly generic pipelines for things like VertexBuffer and ConstantBuffer creation. The second major challenge was writing the threading code that manages communication between the Python code and the rendering code. The goal was for the rendering code to never block on input from the Python code, so that slow Python never causes stuttering in the framerate or head tracking. This required writing what is essentially a double-buffered queue of frame states.
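The double-buffered exchange described above can be sketched as follows. This is a hypothetical illustration of the idea, not the actual C++ implementation: the simulation side publishes complete frame states, and the render side swaps in the newest one with only a brief lock hold, so rendering never waits on slow Python code.

```python
import threading

class FrameStateExchange:
    """Sketch of a double-buffered frame-state handoff (illustrative only).

    The producer (Python simulation) may run far slower than the consumer
    (render loop); the consumer always keeps a previous/current pair to
    interpolate between and never blocks waiting for a new frame."""

    def __init__(self, initial_state):
        self._lock = threading.Lock()
        self._pending = None           # newest state from the producer, if any
        self.previous = initial_state  # frame being interpolated *from*
        self.current = initial_state   # frame being interpolated *to*

    def publish(self, state):
        # Simulation thread: overwrite any unconsumed frame -- the renderer
        # only ever wants the latest complete state.
        with self._lock:
            self._pending = state

    def try_advance(self):
        # Render thread: swap in the newest frame if one has arrived.
        # The lock is held only for the pointer swap, so this never stalls.
        with self._lock:
            pending, self._pending = self._pending, None
        if pending is not None:
            self.previous, self.current = self.current, pending
            return True
        return False
```

A real non-blocking version could go further (e.g. a try-lock or lock-free slot swap), but the shape is the same: the render loop polls, and if no new frame exists it simply keeps interpolating the pair it already has.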
Beyond the Python extension itself, we also wrote Python code that gets dynamically loaded into the library, as well as example Python scripts that use our code. This includes classes representing conceptual objects like Quaternions or Models, which are stored internally as just ID numbers. It also includes higher-level helper code, such as a model loader that takes .obj files and outputs virtualpy models.
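The model-loading helper can be illustrated with a minimal sketch of .obj parsing (this is a stand-alone example, not virtualpy's actual loader): the vertex and face records are read into plain lists, which could then be handed to the library's model-creation call.

```python
# Illustrative sketch of parsing the 'v' and 'f' records of a Wavefront
# .obj file. Triangular faces are assumed; normals, texture coordinates,
# and other record types are ignored for brevity.

def load_obj(lines):
    """Return (vertices, faces) from an iterable of .obj lines.
    Faces come back as 0-based vertex index triples."""
    vertices, faces = [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":          # geometric vertex: "v x y z"
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":        # face: "f 1 2 3" or "f 1/1/1 2/2/2 3/3/3"
            # Keep only the vertex index before the first slash; .obj
            # indices are 1-based, so convert to 0-based here.
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces

sample = [
    "v 0 0 0",
    "v 1 0 0",
    "v 0 1 0",
    "f 1 2 3",
]
verts, tris = load_obj(sample)
```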
Our project turned out to work quite well. The major goals were achieved: we have a demo in which the Python code updates only once every two seconds and still looks rather nice. The head tracking is low latency and the framerate is smooth. As expected, we pay the cost of high latency on Python input, so the keyboard movement handled through Python has rather noticeable lag, but that wasn't the focus. Also, to actually leverage Python's strengths, we created a demo that is able to dynamically reload its update function from the source file, allowing the program to change in real time. This turned out to be pretty neat and fun to play with, and is a good way to show off the rapid prototyping possibilities that being able to work in Python opens up.
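The hot-reload trick can be sketched with Python's standard machinery (the module and function names here are made up for illustration; this is not the demo's exact code): the running program periodically reloads the module containing the update function, and keeps the old version if the freshly edited source fails to compile.

```python
# Illustrative sketch of reloading an update function from its source file
# while the program keeps running. `update` is an assumed function name.

import importlib

def reload_update(module):
    """Reload `module` and return (module, update_function).

    If the user has saved the file mid-edit and it no longer parses, keep
    running with the previous version instead of crashing the demo."""
    try:
        module = importlib.reload(module)
    except SyntaxError:
        pass  # stale but working beats fresh but broken
    return module, getattr(module, "update", None)
```

In the main loop this would be called every frame or two; combined with the frame interpolation, edits to the update function show up on screen within seconds of saving the file.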
This is the code that compiles into the Python module. Currently it is only set up as an MSVC project, and I've been testing it with a Python instance compiled with the same compiler. Getting it to work from scratch on your machine may prove troublesome at the moment, especially since the project is hardcoded to use my Oculus SDK and DirectX SDK directories for compiling. I was told this was okay; if you need help setting it up, let me know. As for actual example Python code, setup.py is the best and widest-reaching example, showing off most of the functionality.