neuralize-ai/edgerunner-android
Edgerunner Android

Kotlin bindings for Edgerunner

💡 Introduction

Edgerunner Android provides a Kotlin wrapper for the Edgerunner AI inference library. This library is a work in progress, currently supporting CPU inference of TFLite models. Support for GPU and vendor-specific NPU inference will follow incrementally, along with various other inference engines. See Edgerunner for more details.

The wrapper logic and public Kotlin classes are found in the edgerunner Android library.

🛠 Installation

The library will soon be published to Maven Central.

🕹 Usage

An example image classification app is bundled with the project. See imageclassifier.kt for edgerunner API usage.

In general, the library can be used as follows:

import com.neuralize.edgerunner.Model

// ...

/* read model file into a ByteBuffer -> modelBuffer */
// ...

val model = Model(modelBuffer.asReadOnlyBuffer())

/* ByteBuffer, direct access to input buffer for model inference */
val inputBuffer = model.getInput(0)?.getBuffer() ?: error("failed to get input tensor 0")

/* write input to `inputBuffer` */
// ...

val executionStatus = model.execute()

/* check `executionStatus` before reading outputs */

val outputBuffer = model.getOutput(0)?.getBuffer() ?: error("failed to get output tensor 0")

/* interpret output */
// ...
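The file-reading step elided above can be sketched as follows. This is an assumption about the setup, not part of the Edgerunner API: it uses plain `java.io`/`java.nio` with a hypothetical file path, and copies the bytes into a direct `ByteBuffer` on the assumption that the native inference backend needs directly addressable memory. On Android you would typically obtain the bytes from `AssetManager` instead.

```kotlin
import java.io.File
import java.nio.ByteBuffer

// Sketch: load a model file from a hypothetical path into a direct ByteBuffer.
// A direct buffer is assumed here so native code can read it without an extra copy.
fun loadModelBuffer(path: String): ByteBuffer {
    val bytes = File(path).readBytes()
    val buffer = ByteBuffer.allocateDirect(bytes.size)
    buffer.put(bytes)
    buffer.flip() // reset position so the buffer is ready for reading
    return buffer
}
```

The resulting buffer can then be passed as `Model(loadModelBuffer(path).asReadOnlyBuffer())`, matching the snippet above.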

The full API for Model and Tensor can be found in Model.kt and Tensor.kt respectively.
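Since `getBuffer()` returns a raw `ByteBuffer`, filling the input and interpreting the output usually means viewing the bytes as a typed buffer. The helpers below are a sketch of that step, assuming float32 tensors in native byte order; the tensor element type and layout depend on the model, so check the actual `Tensor` API before relying on this.

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Sketch: write float32 values into a tensor's backing ByteBuffer,
// assuming native byte order (an assumption, not guaranteed by the API).
fun writeFloats(buffer: ByteBuffer, data: FloatArray) {
    buffer.order(ByteOrder.nativeOrder()).asFloatBuffer().put(data)
}

// Sketch: read `count` float32 values back out of an output buffer.
fun readFloats(buffer: ByteBuffer, count: Int): FloatArray {
    val out = FloatArray(count)
    buffer.order(ByteOrder.nativeOrder()).asFloatBuffer().get(out)
    return out
}
```

With these, the snippet above becomes `writeFloats(inputBuffer, myInput)` before `model.execute()` and `readFloats(outputBuffer, numOutputs)` after it.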

📜 Licensing

See the LICENSING document.