
Project Update 2

A project log for Mixed Reality Drone Racing

PC game to build virtual 3D racetracks in your house then race actual physical drones on them

Vinay • 12/20/2022 at 13:57

What's New
Picking up from last time, my goal was to get the game to a playable state, which, if you remember, was missing two things: flight controls and a camera feed. So for this update, my focus was on implementing those two components.

Adding the flight controls was relatively straightforward. To make things easy, I wanted to use a generic PS3-type controller with some basic inputs set up for forward/lateral/vertical velocities, yaw rate, takeoff/landing, and motor kill/reset. Luckily, Unity makes it really easy to support gamepad controllers, so I just set up a few callbacks listening for inputs, which I then mapped to velocity commands and published over ROS. I then updated the Crazyflie ROS node to subscribe to these command messages, which are executed on the drone using the Python cflib library. This works pretty seamlessly and feels responsive enough to fly with (after some tuning).
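For reference, here's a rough sketch of what that ROS-side bridge looks like, boiled down to a single velocity-setpoint subscriber. It assumes a ROS 1 / rospy setup, and the topic name, message type, and radio URI are placeholders for illustration rather than the exact values from my setup:

```python
# Minimal sketch: subscribe to velocity commands published from Unity and
# forward them to the Crazyflie via cflib. Topic name, message type, and
# radio URI are assumptions, not the exact values from my project.
import rospy
from geometry_msgs.msg import Twist

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # hypothetical radio address


def main():
    cflib.crtp.init_drivers()
    rospy.init_node('crazyflie_teleop')

    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        cf = scf.cf

        def on_cmd_vel(msg):
            # Map the gamepad-derived velocities onto a world-frame velocity
            # setpoint: linear.x/y/z in m/s, angular.z as yaw rate in deg/s.
            cf.commander.send_velocity_world_setpoint(
                msg.linear.x, msg.linear.y, msg.linear.z, msg.angular.z)

        rospy.Subscriber('/crazyflie/cmd_vel', Twist, on_cmd_vel)
        rospy.spin()

        # Cut the motors when the node shuts down.
        cf.commander.send_stop_setpoint()


if __name__ == '__main__':
    main()
```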

By contrast, adding the FPV video feed was a bit more involved and came with a bunch of questions:

To record video, I decided to use the smallest, cheapest off-the-shelf camera I could find, which ended up being this WolfWhoop WT03 FPV camera + transmitter on Amazon. It weighs about 5 grams, transmits at 25 mW (its lowest power setting), runs off a 3-5V input, and draws around 500mA. It seemed like a good option because it was light enough and low enough on power consumption to be powered by the Crazyflie's onboard LiPo. Additionally, being an analog video transmitter, it should be relatively low latency.

To receive the video, I needed an analog video receiver. I found the Skydroid 5.8G OTG Receiver on Amazon for around $30, which can receive an analog video stream and output it as a standard webcam feed on my Linux PC. The webcam feed produced by this receiver is a sequence of 640x480 frames encoded as MJPEG (basically a stream of individually JPEG-compressed frames, with no temporal/multi-frame compression).
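If you want to sanity-check the receiver's output yourself, a few lines of OpenCV are enough to confirm the negotiated codec and resolution and to display the feed. The device index below is a guess; it depends on what else is plugged into your machine:

```python
# Quick sanity check of the receiver's webcam feed: report the negotiated
# codec/resolution and display frames. Device index is an assumption.
import cv2

cap = cv2.VideoCapture(0)  # hypothetical index for the Skydroid OTG receiver

fourcc = int(cap.get(cv2.CAP_PROP_FOURCC))
codec = ''.join(chr((fourcc >> (8 * i)) & 0xFF) for i in range(4))
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print(f'codec={codec} resolution={width}x{height}')  # expect MJPG 640x480

while True:
    ok, frame = cap.read()  # OpenCV decodes the MJPEG frames for us
    if not ok:
        break
    cv2.imshow('FPV feed', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```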

To render this video feed into the game, I was looking for a quick solution, and my main approach was to try to capture and render the video feed entirely in Unity using the WebCamTexture class. I ran into a fair bit of trouble with this approach, so for this demo I chose to just render the video feed outside of Unity using VLC player. I'm not very happy with this solution, but it worked well enough to give me a feel for the game's playability.

Challenges and Concerns I Had

There were a bunch of challenges in getting to this point. Let's go through them:
Sourcing components - this was not very complicated, but it did require doing a bit of homework to make sure that the camera met the power/weight constraints for the drone and that the receiver would be compatible with it (as well as with my PC). This all seemed to work out, though.

Adding the camera to the drone - this was again not super complicated but required some homework (I am also less confident in my soldering/hardware skills than I am in my software skills). I used the Crazyflie's prototyping deck to solder in leads to the power supply and used a JST pin header to connect to the camera. To physically attach the camera to the drone, I just used electrical tape as a quick-and-dirty solution (but it's on my todo list to build a 3D-printed mount).

Getting the video feed into Unity - this was much more complicated than I was expecting (or than it should have been). As I mentioned above, I was primarily trying to use Unity to directly capture and render the video feed, since that felt like the quickest path, and I want the stream rendered in Unity because eventually I want to overlay Unity's virtual objects onto the video feed. I knew Unity had support for capturing webcam streams, and I was able to get it working using one of my other web cameras. However, after a fair amount of debugging and testing, I finally realized that the webcam interface for the Skydroid OTG receiver only outputs MJPEG-encoded streams (which Unity does not support), whereas most modern webcams use H.264/H.265 encoding (which Unity does support). To me this pretty much ruled out capturing the webcam feed directly in Unity, which is why I settled for just rendering the video with VLC for now. Figuring this out is definitely one of my big goals for the next update, though.
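One possible workaround I'm considering (but haven't built yet) is to decode the MJPEG frames outside Unity, e.g. with OpenCV, and forward raw frames over the ROS link that's already in place, so Unity never has to touch the MJPEG stream. A minimal sketch of that idea, again assuming ROS 1 and with a made-up topic name and device index, might look like this:

```python
# Possible workaround sketch (not implemented yet): decode the MJPEG frames
# outside Unity with OpenCV and republish them as raw ROS images that Unity
# could subscribe to. Topic name and device index are assumptions.
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node('fpv_bridge')
pub = rospy.Publisher('/fpv/image_raw', Image, queue_size=1)
bridge = CvBridge()

cap = cv2.VideoCapture(0)  # hypothetical device index for the Skydroid receiver
rate = rospy.Rate(30)

while not rospy.is_shutdown():
    ok, frame = cap.read()  # frames arrive MJPEG-encoded; OpenCV decodes to BGR
    if ok:
        pub.publish(bridge.cv2_to_imgmsg(frame, encoding='bgr8'))
    rate.sleep()
```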

Understanding the video feed latency - camera latency is a big concern and is one of the main limiting factors on playability. From my initial flight tests, I could tell that there was a noticeable but not unreasonable amount of latency (the camera felt sufficiently responsive), but I wanted to try to quantify it a bit. As a reference point, I know that most FPV drone racers try to keep their camera latency under 40ms, and that anything over ~200ms becomes distracting and eventually unacceptable. I did some quick tests inspired by this blog post, where I basically pointed the camera back at a display of the video feed, put a stopwatch next to the display, and took screenshots so I could see the real time and the time displayed in the video feed simultaneously. Here's an example of a captured screenshot. I sampled this a few times and saw latency in the 100-130ms range. In the future, I may try to improve this, but for now I'll just keep an eye on the number.
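If you'd rather not juggle a physical stopwatch, the same idea can be scripted: render a millisecond timer next to the live receiver feed, point the drone's camera back at the screen, and screenshot both. The difference between the on-screen timer and the timer visible inside the looped-back feed is the glass-to-glass latency. This is just a rough sketch of that setup (device index assumed), not exactly what I used:

```python
# Rough sketch of the mirror-style latency test: show the live receiver feed
# under a millisecond timer, point the drone camera at the screen, screenshot.
import time
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # hypothetical index for the Skydroid OTG receiver
t0 = time.monotonic()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Render a large timestamp (ms since start) in a strip above the video.
    strip = np.zeros((80, frame.shape[1], 3), dtype=np.uint8)
    ms = int((time.monotonic() - t0) * 1000)
    cv2.putText(strip, f'{ms} ms', (10, 60), cv2.FONT_HERSHEY_SIMPLEX,
                2.0, (255, 255, 255), 3)
    cv2.imshow('latency test', np.vstack([strip, frame]))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```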

Understanding the camera's impact on battery life - another big concern of mine is the battery life of the drone once the camera feed has been added. I noticed myself getting anxious about how much battery was remaining while flying (sort of like range anxiety). To help understand and quantify this for myself, I wrote a couple of scripts to log the voltage and motor speeds during charging and discharging tests (you can see an example of the results in the following spreadsheet). Compared to the Crazyflie's nominal 7-minute battery life, I was seeing around 3 minutes of continuous flight time with the FPV camera added (and reasonably aggressive motor inputs). This helped set my expectations for flight time, but I think I also want to add some sort of battery level indicator to the game to help with this.
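For anyone curious, a trimmed-down sketch of what such a discharge-logging script can look like with the standard cflib logging API is below. The radio URI is a placeholder, and the motor values would be logged by adding more variables to the same log config (exact names depend on the firmware's log TOC):

```python
# Minimal sketch of a discharge-logging script: stream the battery voltage
# (pm.vbat) from the Crazyflie to a CSV file at a fixed rate. URI is assumed.
import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # hypothetical radio address

cflib.crtp.init_drivers()

log_conf = LogConfig(name='battery', period_in_ms=500)
log_conf.add_variable('pm.vbat', 'float')  # battery voltage in volts
# Motor speeds can be added as extra variables here in the same way.

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, log_conf) as logger, open('discharge_log.csv', 'w') as f:
        f.write('elapsed_s,vbat\n')
        t0 = time.time()
        for _, data, _ in logger:
            elapsed = time.time() - t0
            f.write(f'{elapsed:.1f},{data["pm.vbat"]:.3f}\n')
            if elapsed > 600:  # stop after 10 minutes
                break
```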

Lastly, playability concerns - my initial test flights brought up a few concerns around responsiveness (see latency above) and battery life anxiety (see battery life above). The biggest playability issue I faced, though, was the fact that I had 2 separate screens to look at (the rendered game view and the FPV view). In fact, this is why I crashed in the flight test demo video above: I was too focused on the game view and not paying any attention to the FPV view. To me this just emphasized the importance of integrating these 2 views and made it clear that all the important information should be accessible from a single place.

What's Next
For next steps, my main focus is on unifying the FPV and rendered game views and removing the need for 2 separate displays. Ideally, I would like to overlay the rendered game objects onto the FPV feed to create a mixed reality display, but I'm not sure how easy that will be without trying. I'd also like to make some additional quality-of-life/playability improvements, including 3D printing a camera mount + bumper guards for the drone and further tuning the input controls. Lastly, I'd like to publish the code for others to use. This might take me some time, but for the next update I'll hopefully have code to share, a parts list, and build instructions so you can try this thing out for yourself!

