August 8, 2022

DeepAR for Flutter - The Journey 💙

DeepAR offers you fun AR filters and 3D masks that integrate seamlessly into your mobile application and allow your users to add AR effects to their video calls, capture videos with flashy sunglasses, and much more.

Up until now, we have officially supported native Android, iOS and Web SDKs. We are very excited to announce official support for Flutter as well!

Insight into the journey 🏁

The first version of the DeepAR Flutter plugin supports all the major features that most of our users need (more advanced stuff coming later):

  1. Live camera preview of filters, masks and effects, rendered as regular widgets.
  2. Taking screenshots, recording videos, switching filters, and more - right from the plugin.

The natural choice was to turn to Flutter Platform Channels and build a wrapper around the existing DeepAR Android and iOS SDKs.

For anyone unfamiliar, Flutter Platform Channels provide APIs that connect Flutter code with native Android and iOS code. They are used to build any plugin that depends on native functionality - playing audio or connecting to Bluetooth, to name a few.
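
As a rough sketch of what such a call looks like on the Dart side (the channel and method names here are hypothetical, not the plugin's actual API):

```dart
import 'package:flutter/services.dart';

// Hypothetical channel and method names, for illustration only.
const MethodChannel _channel = MethodChannel('deepar/example');

Future<void> switchFilter(String filterPath) async {
  // Arguments are serialized, sent Dart → C++ → Java/Swift, handled
  // natively, and the result travels back the same way.
  await _channel.invokeMethod<void>('switchFilter', {'path': filterPath});
}
```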

While Flutter Platform Channels allowed us to call native DeepAR methods, they do introduce latency, since data travels from Dart → C++ → Java/Swift and then back. The latency also grows as the data passed in the arguments gets bigger.

This latency isn't a problem for methods that are called sporadically - taking a screenshot or switching a filter, say. It can, however, become a bottleneck for methods that stream large arguments, which in our case meant feeding live camera frames to DeepAR for processing and later rendering as a widget.

Live Camera Feed (30 FPS): The challenge 🏃🏻

For our first attempt, we sought to build upon the official camera plugin, since it provides users with advanced features like zoom and focus out of the box. The plugin also exposes the method startImageStream(), which gave us live frames that we could then pass to our native implementation for processing. Easy, right?
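
A simplified sketch of that first approach - the channel name and payload shape are illustrative assumptions, not our actual implementation:

```dart
import 'package:camera/camera.dart';
import 'package:flutter/services.dart';

// Illustrative channel name; the payload shape is also an assumption.
const MethodChannel _channel = MethodChannel('deepar/frames');

Future<void> streamFramesToDeepAR(CameraController cameraController) {
  // startImageStream() invokes the callback for every camera frame (~30/s);
  // each invocation here pushes a large pixel buffer over the bridge.
  return cameraController.startImageStream((CameraImage image) {
    _channel.invokeMethod<void>('processFrame', <String, Object>{
      'width': image.width,
      'height': image.height,
      'planes': image.planes.map((plane) => plane.bytes).toList(),
    });
  });
}
```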

While this did work, we observed a huge drop in app performance along with high memory usage. The reason: we hadn't taken into account the latency and memory cost of processing huge image buffers and transferring them to DeepAR's native methods via the platform bridge.

Upon inspecting with Flutter DevTools, we found that a lot of resources were being consumed appending our Uint8List image frames into JSON, encoding that to ByteData, and then decoding it all back in the native code. But there is just the right optimization for this:

Platform channels offer multiple codecs for communicating between platforms. The StandardMessageCodec we had been using re-serializes every message, and combined with our JSON wrapping that meant redundant encoding work on every single frame. Lucky for us, Flutter also provides a BinaryCodec that works directly with byte data - exactly what we needed.
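
A minimal sketch of the switch (the channel name is illustrative):

```dart
import 'dart:typed_data';
import 'package:flutter/services.dart';

// BinaryCodec hands ByteData across the bridge as-is: no JSON wrapping,
// no re-encoding. The channel name is illustrative.
const BasicMessageChannel<ByteData> _frameChannel =
    BasicMessageChannel<ByteData>('deepar/frames', BinaryCodec());

Future<void> sendFrame(Uint8List frameBytes) async {
  // A view over the existing buffer; avoids an extra copy on the Dart side.
  await _frameChannel.send(ByteData.sublistView(frameBytes));
}
```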

After the optimization, performance improved significantly and we were able to run and render the frames smoothly on a Google Pixel 4A. Memory consumption, however, was still substantial: on lower-end devices with 2 GB of RAM, performance wasn't satisfactory and the app lagged. With no further optimization possible, we had hit a dead end and needed a different approach.

Writing our own Camera Code 📸

To avoid the bottleneck of marshalling camera frames via Flutter Platform Channels, we decided to write our own native implementation of the camera and pair it directly with DeepAR. As expected, this worked beautifully.

We used CameraX on Android and CameraController on iOS. To render the frames as a Flutter widget, we had two options:

  1. Native views (used by Google Maps)
  2. Flutter Texture (used by the camera plugin)

DeepAR uses OpenGL to render its output to an Android surface. Flutter's Texture widget can be backed by an OpenGL image source, which is considerably more performant than native views, so we decided to work with the texture.
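
On the Dart side, consuming such a texture is pleasantly simple. A sketch, assuming the native side has already registered the surface and handed over its integer id:

```dart
import 'package:flutter/widgets.dart';

// The native side registers an OpenGL-backed surface with Flutter's
// texture registry and passes its integer id over to Dart.
Widget buildPreview(int textureId) {
  return AspectRatio(
    aspectRatio: 9 / 16, // illustrative; use the real camera aspect ratio
    child: Texture(textureId: textureId),
  );
}
```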

The implementation on Android was straightforward and worked smoothly. On iOS, however, FlutterTexture doesn't expose a direct window (surface) for OpenGL to render to; instead, the content to be rendered needs to come from a CVPixelBuffer.

When configured to produce CVPixelBuffers, DeepAR incurs additional CPU and memory overhead, which posed a performance issue on older iPhone devices.

To combat this, we decided to use a native view for the iOS implementation. This brought a significant performance improvement, and we got decent enough performance even on an iPhone 7. That's the dream, right?
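
Conceptually, the Dart side of this hybrid setup boils down to something like the following sketch (the view type string is an illustrative placeholder):

```dart
import 'dart:io' show Platform;
import 'package:flutter/widgets.dart';

// Sketch of the hybrid rendering path: a Texture on Android, a native
// platform view on iOS. The 'deepar_view' view type is illustrative.
Widget buildDeepArPreview(int textureId) {
  return Platform.isIOS
      ? const UiKitView(viewType: 'deepar_view')
      : Texture(textureId: textureId);
}
```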

Finishing it up ⛳

From here on, things were comparatively simple: we merged the two native implementations behind a Dart wrapper and added functions to take screenshots and record videos. We exposed a DeepArController class with methods similar to the default camera package, to keep things familiar for our fellow developers.
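
To give a feel for the result, here's a sketch of app-side usage. DeepArController is the class the plugin exposes; the import path and method names below are illustrative, so check the package docs on pub.dev for the actual API:

```dart
import 'package:deepar_flutter/deepar_flutter.dart'; // assumed package name

// Hypothetical usage; the method names are illustrative, not guaranteed.
final controller = DeepArController();

Future<void> demo() async {
  await controller.initialize();     // set up the native camera + DeepAR
  // ... render the preview widget in the tree, then:
  await controller.takeScreenshot(); // capture the current frame
}
```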

Please check out the plugin on pub.dev, with the source code available here. We are accepting pull requests from the community for fixes and improvements, and we also encourage you to file issues on our GitHub repository for any problems you run into.

Made with love 💙


Ready to start using DeepAR?

Create an account for free and integrate the SDK within minutes. We even have a load of prebuilt integrations to make your life easier. Or if you prefer, get in touch with us over here.
