Unity ARKit Face Animation in Real-Time with the iPhone X
Interested in doing live facial animation in Unity with your iPhone X? Take a look at this complete tutorial and download our free resources - Unity project, 3D character and iPhone X animation app - to create your own animations with the Unity ARKit Remote and the 52 ARKit blendshapes!
The iPhone X Face Tracking, a major breakthrough for accessible motion capture
As you might know by now, the iPhone X and later models can be used as motion capture devices for facial animation thanks to their front-facing TrueDepth camera and ARKit Face Tracking. This relatively new feature makes 3D facial animation more accessible than ever thanks to an affordable and easy-to-use combination of hardware and software. Thanks to this solution, studios and creators who were previously unable to get their hands on this type of technology can now integrate facial animation in their projects. ARKit Face Tracking might very well lead to the creation of new production pipelines, and even be the future of mocap for some!
This page presents a global overview of performance capture in Unity using the iPhone X and the latest iOS devices able to run all ARKit 2 Face Tracking features. You will also find a step-by-step tutorial as well as free resources to show the extent of the ARKit motion capture capabilities and how to use them with Unity’s ARKit Remote, to stream your animation and record it live in Unity!
- About Polywink’s “Animation for iPhone X” Solution
- Using the iPhone X with Unity ARKit Remote for live facial animation
- Unity tutorial to animate your own characters in real-time with an iPhone X
- Phase 1 - Install our Polywink Face Animation app on your iPhone X
- Phase 2 - Connecting your iPhone X to ARKit Remote
- Best practices for the iPhone X performance capture
About Polywink’s “Animation for iPhone X” Solution
Polywink strives to make facial animation accessible to everyone. We’re proud to think we contribute to sparking creativity not only among large productions but especially among smaller indie studios, which is why we’re always on the lookout for new, flexible facial animation techniques. The whole team was quite excited by the release of the iPhone X, and we immediately wanted to explore this wonderful little device to its full capacity.
That’s the reason why we’ve released our Animation for iPhone X, a solution taking advantage of the iPhone X’s face tracking features. Animation for iPhone X automatically generates the specific set of 52 blendshapes required for ARKit facial animation, making it an easy and accessible service to quickly animate any 3D character with an iPhone X. Our solution works for any character morphology, from scanned heads to photorealistic 3D models or cartoonish characters, and preserves the user's topology. Simply upload your 3D model on our platform and you will receive a ready-to-use model within 24 hours! Moreover, the delivered expressions are perfectly adapted to the specific topology and UVs of your 3D model. Don’t hesitate to download a sample if you want to test our services before buying them!
Using the iPhone X Facial Real-Time Animation with Unity ARKit Remote
There are multiple ways to animate your 3D characters with the iPhone X, one of them being an app called Face Cap that you can use to record your facial motion capture data with the iPhone X and export it in FBX format.
However, if you want to stream your animation live to Unity, the Unity ARKit Plugin allows anyone to animate their 3D characters in real-time with a mere iPhone X and save the animation data afterwards. This project enables developers to animate 3D models live, directly in the Unity Editor, using the ARKit Face Tracking potential. One of the bottlenecks here is that each developer must first build an app that sends the iPhone X data over to Unity; but you’re in luck, as Polywink has already created that iPhone X animation app, which you can download for free below!
This project aims at delivering an animatable 3D head model with Polywink blendshapes directly into Unity in order to use your iPhone X as an intuitive face animation tool. Thanks to this plugin, you will be able to capture facial expressions directly in your Unity Scene through a streamlined process. Discover a new animation pipeline for your 3D characters!
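To give an idea of what the plugin does for you: each frame, ARKit streams 52 blendshape coefficients (values between 0 and 1) to the Editor, and a driver script maps them onto your character's blendshape weights (which Unity expresses on a 0-100 scale). Here is a minimal sketch based on the Unity ARKit Plugin's face-tracking example; treat the exact class and event names as assumptions tied to that plugin version:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit Plugin namespace

// Sketch: drives a SkinnedMeshRenderer's blendshapes from the
// coefficients streamed by ARKit face tracking.
public class BlendshapeDriverSketch : MonoBehaviour
{
    SkinnedMeshRenderer skinnedMesh;

    void Start()
    {
        skinnedMesh = GetComponent<SkinnedMeshRenderer>();
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // anchorData.blendShapes maps ARKit names (e.g. "jawOpen")
        // to coefficients between 0 and 1.
        foreach (KeyValuePair<string, float> shape in anchorData.blendShapes)
        {
            int index = skinnedMesh.sharedMesh.GetBlendShapeIndex(shape.Key);
            if (index >= 0)
                skinnedMesh.SetBlendShapeWeight(index, shape.Value * 100f);
        }
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
    }
}
```

Depending on how your model was exported, the blendshape names on the mesh may carry a prefix (e.g. "blendShape1.jawOpen"), in which case you would need a small mapping between ARKit names and your mesh's names.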
Unity tutorial to animate your own characters in real-time with an iPhone X
The following solution is compatible with Unity 2017.4 and later versions.
→ What do I need in order for this solution to work?
- Unity v2017.4+
- An Apple iOS device that supports ARKit face tracking (iPhone X or later)
Phase 1 - Install our Polywink Face Animation app on your iPhone X
For the time being, we’re only distributing this app through TestFlight, Apple’s platform for releasing beta iOS software. To be informed about its development, you can follow us on Facebook!
- Download the TestFlight app to access our beta
- Open the following link on your iPhone: https://testflight.apple.com/join/B2dtmUe7
- Install the Polywink app and follow the steps described by TestFlight
Phase 2 - Connecting your iPhone X to ARKit Remote
You will then need a Unity project which contains your model and the corresponding 52 ARKit blendshapes. If you don’t already have them, you can either sculpt them manually (good luck with that) or simply order our Animation for iPhone X service on Polywink! You’ll receive a Unity project ready for animation with everything properly set up.
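If you’re bringing your own blendshapes, it can save time to verify that your mesh actually exposes the ARKit names before connecting the phone. A minimal sketch (the component name is hypothetical, and only a few of the 52 names are listed for brevity; the full list is in Apple’s ARFaceAnchor blendshape documentation):

```csharp
using UnityEngine;

// Sanity check: warns about ARKit blendshape names missing
// from the mesh on this SkinnedMeshRenderer.
public class BlendshapeChecker : MonoBehaviour
{
    // A few of the 52 ARKit blendshape names, for illustration only.
    static readonly string[] arkitShapes = {
        "browInnerUp", "eyeBlinkLeft", "eyeBlinkRight",
        "jawOpen", "mouthSmileLeft", "mouthSmileRight"
    };

    void Start()
    {
        Mesh mesh = GetComponent<SkinnedMeshRenderer>().sharedMesh;
        foreach (string shapeName in arkitShapes)
        {
            if (mesh.GetBlendShapeIndex(shapeName) < 0)
                Debug.LogWarning("Missing ARKit blendshape: " + shapeName);
        }
    }
}
```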
1. Opening a scene in the Unity Editor:
- Option A: If you’re working with our Unity solution for iPhone X and you’ve received a Unity project already set up with your 3D model, packed with an iPhone X app: simply open the Unity project contained in your package
- Option B: If you’re working with your own blendshapes, download the project here and open it in Unity as you normally would.
2. Make sure your iPhone X is connected to the same wireless network as your Mac/PC
Please note that if you’re working on a Mac, you should connect your device directly via a USB cable.
3. Select the following scene: Assets/UnityARKitPlugin/Examples/FaceTracking/FaceBlendshapeScenes
4. Launch the Polywink app on the device.
The screen should read “Waiting for connection...”.
5. In the Unity Editor on your Mac, open the “Console” window, click the “Editor” dropdown, and select the iPhone you’ve plugged in via USB.
6. Press Play
7. Click on “Start Remote ARKit Face Tracking Session” in the “Game” window.
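Once the session is running, the face tracking drives your character’s blendshape weights live in the Editor, and you can bake those weights into an AnimationClip to save the take. Here is a hedged sketch of that idea (the class and method names are hypothetical, not part of the plugin):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: samples the driven blendshape weights every frame and
// bakes them into an AnimationClip that can be saved and reused.
public class BlendshapeRecorder : MonoBehaviour
{
    SkinnedMeshRenderer skinnedMesh;
    Dictionary<string, AnimationCurve> curves = new Dictionary<string, AnimationCurve>();
    float time;
    public bool recording = true;

    void Start() { skinnedMesh = GetComponent<SkinnedMeshRenderer>(); }

    void LateUpdate()
    {
        if (!recording) return;
        time += Time.deltaTime;
        Mesh mesh = skinnedMesh.sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
        {
            string shapeName = mesh.GetBlendShapeName(i);
            if (!curves.ContainsKey(shapeName))
                curves[shapeName] = new AnimationCurve();
            curves[shapeName].AddKey(time, skinnedMesh.GetBlendShapeWeight(i));
        }
    }

    public AnimationClip BuildClip()
    {
        var clip = new AnimationClip();
        foreach (var pair in curves)
            clip.SetCurve("", typeof(SkinnedMeshRenderer),
                          "blendShape." + pair.Key, pair.Value);
        return clip;
    }
}
```

In the Editor you could then persist the result with AssetDatabase.CreateAsset, e.g. saving the clip to "Assets/FaceTake.anim".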
Best practices for the iPhone X performance capture
Here are a few best practices for using the ARKit iPhone X for facial animation:
- You should use this app in a brightly lit room
- Make sure your whole face is visible (for example, tie your hair up in a bun or ponytail) and do not put your hands between your face and the 3D camera.
- This solution will work best if you can find a stand for your iPhone X or if you place it down. In any case, it should remain still while recording.
- The range of the iPhone X is limited: you shouldn’t move too close or too far from the 3D camera.
- With that same idea in mind, don't look too far down, up or sideways.
- If you plan on recording for longer than 3 minutes straight, you might want to consider using a cloud service such as Dropbox, as the exported files might be too big to transfer otherwise.
- You also might want to post-process your recording by filtering out any noise, jitter or glitches for a seamless result.
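For the noise filtering mentioned above, one simple approach (a sketch of a generic technique, not Polywink’s actual pipeline) is an exponential moving average over each coefficient before it is applied to the mesh:

```csharp
// Sketch: exponentially smooths each incoming blendshape
// coefficient to reduce jitter. All names here are hypothetical.
public class BlendshapeSmoother
{
    readonly float[] smoothed;  // running average, one per blendshape
    readonly float alpha;       // 0..1; lower = smoother but laggier

    public BlendshapeSmoother(int shapeCount, float alpha = 0.3f)
    {
        smoothed = new float[shapeCount];
        this.alpha = alpha;
    }

    // Blend the new raw coefficient with the running average
    // and return the filtered value to apply to the mesh.
    public float Smooth(int index, float raw)
    {
        smoothed[index] = alpha * raw + (1f - alpha) * smoothed[index];
        return smoothed[index];
    }
}
```

The alpha value is a trade-off: a lower value removes more jitter but makes the animation respond more slowly, so it is worth tuning per project.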
Please note that Polywink’s quality-controlled blendshapes are built for optimal results; you might get less precise results if you create your own animation shapes.
Now have fun with this new tool and don’t forget to tag us on your tests, we’d love to see your characters and Polywink’s automatically generated blendshapes in action! If you have any questions, contact us directly through our website or social media and we’ll be happy to answer as soon as we can.