
Realtime 3D avatar system and cross-platform rendering engine built from scratch for web3 interoperability and the open metaverse.

DISCONTINUED!

Due to changes on the horizon, we've disabled new API key generation in Alter Studio. This means we are pausing new partner signups. Existing partners will have access to the assets SDK until August 31, 2022, after which access will be discontinued.

We apologize for any interruptions to your development.

As always, we are grateful for your trust and support.

Core by Alter

Core by Alter is the cross-platform core technology and SDK powering Alter. It consists of a real-time 3D avatar system and facial motion capture, built from scratch for web3 interoperability and the open metaverse. Easily pipe avatars into your game, app, or website; it just works. Check out the included code samples to learn how to get started, or try the live web demo or TestFlight.

Please star us ⭐⭐⭐ on GitHub—it motivates us a lot!

📋 Table of Contents

  • 🤓 Tech Specs
  • 💿 Installation
  • 📄 License
  • 🚀 Use Cases
  • Known Limitations

🤓 Tech Specs

🚉 Supported Platforms

  • iOS and iPadOS (Swift Package or XCFramework)
  • Android (Gradle/Maven)
  • Web browsers (npm package)

✨ Avatar Formats

  • Head only
  • A bust with clothing
  • A bust with clothing and background (Soon)
  • Accessories only (e.g. for AR filters) (Soon)
  • Full body (Soon)

🤪 Motion Capture

✨ Features

  • 42 tracked facial expressions via blendshapes
  • Eye tracking including eye gaze vector
  • Tongue tracking
  • Light & fast, just 3MB ML model size
  • ≤ ±50° pitch, ≤ ±40° yaw and ≤ ±30° roll tracking coverage
  • 3D reprojection to input photo/video
  • Platform-suited API and packaging with internal optimizations
  • Simultaneous back and front camera support
  • Powered by mocap4face

🤳 Input

  • Any webcam
  • Photo
  • Video
  • Audio

📦 Output

  • ARKit-compatible blendshapes
  • Head position and scale in 2D and 3D
  • Head rotation in world coordinates
  • Eye tracking including eye gaze vector
  • 3D reprojection to the input photo/video
  • Tongue tracking
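
To make the shape of that per-frame output concrete, here is an illustrative TypeScript sketch. The type and field names are assumptions made for this README rather than the actual alter-core API; consult the platform code samples for the real types.

// Illustrative only: these names are assumptions, not the real alter-core types.
interface Vec3 { x: number; y: number; z: number }

interface TrackingResult {
  blendshapes: Record<string, number>            // 42 ARKit-compatible coefficients, 0..1
  headPosition2D: { x: number; y: number }       // position in input image coordinates
  headPosition3D: Vec3                           // position in 3D space
  headScale: number
  headRotation: { pitch: number; yaw: number; roll: number } // world coordinates
  eyeGazeLeft: Vec3                              // eye gaze direction vectors
  eyeGazeRight: Vec3
  tongue: number                                 // tongue tracking coefficient
}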

⚡ Performance

  • 50 FPS on Pixel 4
  • 60 FPS on iPhone SE (1st gen)
  • 90 FPS on iPhone X or newer

💿 Installation

Prerequisites

Register in Alter Studio to get a unique key to access avatar data from our servers.

See the example code for where to put the key; look for "YOUR-API-KEY-HERE".
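
One way to handle the key without committing it, shown as a minimal TypeScript sketch: read it from an environment variable and fall back to the placeholder. The variable name ALTER_API_KEY is our own choice for this example, not something the SDK requires, and the snippet assumes a Node-based script or a bundler that injects environment variables.

// Minimal sketch (not part of the SDK): keep the Alter Studio key out of version control.
const apiKey: string = process.env.ALTER_API_KEY ?? "YOUR-API-KEY-HERE";

if (apiKey === "YOUR-API-KEY-HERE") {
  throw new Error("Set ALTER_API_KEY or replace the placeholder with your Alter Studio key.");
}

// Pass `apiKey` wherever the sample code contains "YOUR-API-KEY-HERE".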

iOS

To run the example, simply open the attached Xcode project and run it on your iPhone or iPad.

Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".

Swift Package Installation

Add this repository as a dependency to your Package.swift or Xcode Project.

Manual iOS Installation

Download the AlterCore.xcframework from this repository and drag and drop it into your Xcode project.

Android

To run the example, open the android-example project in Android Studio and run it on your Android phone.

Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".

Gradle/Maven Installation for Android

Add this repository to your Gradle repositories in build.gradle:

repositories {
    // ...
    maven {
        name = "Alter"
        url = uri("https://facemoji.jfrog.io/artifactory/default-maven-local/")
    }
    // ...
}

// ...
dependencies {
    implementation "alter:alter-core:0.15.0"
}

Browser/Javascript

To run one of the provided examples, go to the js-example project, run npm install, and then npm run {exampleName} (e.g. npm run renderAvatar or npm run deSerialization). See package.json for a list of all examples.

Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".

NPM Installation

Install the dependency with npm or yarn:

npm install @0xalter/alter-core@0.15.0

If you are using a bundler (such as Webpack), make sure to copy the assets from @0xalter/alter-core to your serving directory. See our Webpack config for an example of what needs to be copied.
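
One possible shape of that copy step, using copy-webpack-plugin, is sketched below. The source and destination paths are assumptions; the Webpack config referenced above is the authoritative list of what the examples actually copy.

// webpack.config.ts (sketch only): copy the assets shipped inside @0xalter/alter-core
// into the build output so they can be served next to your bundle.
import path from "path";
import CopyPlugin from "copy-webpack-plugin";
import type { Configuration } from "webpack";

const config: Configuration = {
  // ...entry, output, and loader settings stay as in your project...
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          from: path.resolve(__dirname, "node_modules/@0xalter/alter-core"), // assumed asset location
          to: "alter-core",                                                  // assumed serving path
        },
      ],
    }),
  ],
};

export default config;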

📄 License

This library is provided under the Alter SDK License Agreement. The sample code in this repository is provided under the Alter Samples License.

This library uses open-source software; see the list of our OSS dependencies and license notices.

🚀 Use Cases

Any app or game experience that uses an avatar as a profile picture or for character animations. The only limit is your imagination.

  • Audio-only chat apps
  • Next-gen profile pics
  • Live avatar experiences
  • Snapchat-like lenses
  • AR experiences
  • VTubing apps
  • Live streaming apps
  • Face filters
  • Personalized stickers
  • AR games with facial triggers
  • Role-playing games

Known Limitations

This is alpha-release software. We are still ironing out bugs, adding new features, and changing the data:

  • Item names within an Avatar Matrix can change
  • The SDK is not yet 100% thread-safe; race conditions or memory leaks can occasionally occur
  • Documentation is still sparse; join our Discord or file an issue if you encounter problems
