crust / Part 10 - Android

Rusty boat: attribution https://www.pexels.com/photo/a-rusty-boat-on-the-seashore-9816335/

In this article we will implement the crust-build pipeline for building out the Android target platform. This will let us run crust on Android mobile devices. We will:

  1. Create an Android Studio project that invokes crust-build as part of its Gradle build
  2. Use the Android NDK to compile the SDL2 and SDL2_image libraries into .so files for each Android architecture
  3. Compile our Rust code into a libcrustlib.so shared library for each architecture and bundle all of the .so files into the final Android application

Note: Our crust-build implementation for Android will be cross platform - meaning you will be able to do an Android target build on either Windows or MacOS.


Android and Rust

Out of all the crust target platforms we will be implementing, Android will be the most difficult. Although Android does have support for building C/C++ code into an application using the Android NDK, there is no integrated support for Rust - the Android NDK cannot compile Rust code itself.

This means that to compile and run Rust code on Android, we have to do most of the build toolchain work ourselves and manually stitch things together into the Android application. The good news is that Rust provides us hooks to do these things - though they can feel a bit messy. I suspect over the coming years, Google may decide to improve the ability to integrate Rust code into Android but I guess we’ll have to wait and see.

We will actually do something a bit different for the Android target - instead of running build commands manually with our crust-build program to produce a final product, we will have our Android Studio project invoke crust-build automatically during its build pipeline. Something like this:

  1. Start the Android Studio project build
  2. At an early stage in the Android build, a custom Gradle task will invoke crust-build which will compile and link our Rust code for each Android target architecture using the Android NDK, producing a set of .so (shared object) libraries which Android can understand
  3. Collect the compiled .so files and place them into the Android build pipeline so they are bundled into the final Android application

There are (at this point in time) four main Android architectures that are commonly supported when building .so libraries: x86, x86_64, ARMv7A and ARMv8A.

I covered a C/C++ implementation in Setup Android app of the A Simple Triangle series - though it was using CMake which we won’t be using with Rust.

Setup Android Studio project

Before starting, make sure you have a recent copy of Android Studio along with the appropriate versions of the Android SDK and NDK installed. No doubt over time things will change and tweaks might be needed, but for this series I used Android NDK 22.1.7171670 with a compile/target SDK of 30 and a minimum SDK of 21 - you will see these declared in the Gradle configuration later in this article.

Note: These steps are the same on Windows as MacOS - you should be able to follow along on either system.

Next, create a new Android Studio project with the Empty Activity template. Leaving the language as Java is fine as we will barely write any Android code so there isn’t much need to add Kotlin support.

Important: Create the project in a temporary directory - we will move it after it has been created.

Let the new project finish synchronising itself - this may take a while - then close Android Studio. Now rename the project directory that Android Studio created in your temporary location to android and move it into our workspace crust directory alongside crust-build and crust-main:

+ root
    + android
        + app
        + gradle
        - build.gradle
        - gradle.properties
        - gradlew
        - gradlew.bat
        - local.properties
        - settings.gradle

    + crust-build
    + crust-main

You may have a few other files or directories inside android but it doesn’t matter - open the project in Android Studio again.

Main activity

Edit the MainActivity.java class and replace it with:

package io.github.marcelbraghetto.crust;

import org.libsdl.app.SDLActivity;

public class MainActivity extends SDLActivity {
    @Override
    protected String[] getLibraries() {
        return new String[]{
                "hidapi",
                "SDL2",
                "SDL2_image",
                "crustlib"
        };
    }
}

Don’t worry about the syntax errors - our Android project doesn’t yet know about SDLActivity - it will come from the SDL library that we’ll prepare in our crust-build implementation. The SDLActivity superclass will bootstrap our Rust library code, which it will know about because we are including crustlib in the list of libraries to load.

You may recall from the start of the series that for Android and iOS we don't actually run our main binary Rust artifact; instead we consume the Rust library artifact, which was named crustlib in crust-main/Cargo.toml:

[lib]
name = "crustlib"

Now delete the app/src/main/res/layout directory - we won’t actually need any UI layouts in our app, instead SDL will take control and draw directly to the graphics display.

Manifest and resources

Replace the content of AndroidManifest.xml with the following:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="io.github.marcelbraghetto.crust">

    <uses-feature android:glEsVersion="0x00020000" />
    <uses-feature
        android:name="android.hardware.touchscreen"
        android:required="false" />
    <uses-feature
        android:name="android.hardware.gamepad"
        android:required="false" />
    <uses-feature
        android:name="android.hardware.type.pc"
        android:required="false" />

    <application
        android:hardwareAccelerated="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:theme="@android:style/Theme.NoTitleBar.Fullscreen"
        tools:ignore="GoogleAppIndexingWarning">

        <activity
            android:name=".MainActivity"
            android:configChanges="keyboard|keyboardHidden|orientation|screenSize|uiMode"
            android:exported="true"
            android:screenOrientation="sensorLandscape">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>

We are declaring a few properties around the features that might be used and applying a full screen theme with no title bar so our app fills the entire display.

We also declare a set of android:configChanges on our main activity so that doing things like rotating the device doesn't destroy our main activity and then recreate it (which is what would normally happen).

Now delete app/src/main/res/values/themes.xml, app/src/main/res/values/colors.xml and app/src/main/res/values-night as we won’t need any theming or colour definitions in our app.

We can also delete app/proguard-rules.pro as we won’t really have any need to optimise the Java code in our app (because there will hardly be any).

Tests

Delete app/src/test and app/src/androidTest - we won’t be writing any logical Android code and I doubt instrumented Android tests would work very well in crust.

Git ignore

There are a few things we don’t want to commit to version control as they are dynamically generated during a build - particularly anything related to symlinks which we’ll be using for some of the source directories. Edit .gitignore in the android directory with:

*.iml
.gradle
/local.properties
/.idea
.DS_Store
/build
/captures
.externalNativeBuild
.cxx
local.properties
/app/src/main/java/org
/app/src/main/assets
/app/src/main/jniLibs

Gradle

Replace the content in app/build.gradle with:

plugins {
    id 'com.android.application'
}

android {
    compileSdk 30
    ndkVersion '22.1.7171670'

    defaultConfig {
        applicationId "io.github.marcelbraghetto.crust"
        minSdk 21
        targetSdk 30
        versionCode 1
        versionName "1.0"
    }

    buildTypes {
        release {
            // Note: If you actually wanted to publish your app you should create a proper release
            // signing key but for easy testing of release builds we can take this approach to adopt
            // the debug signing key by default.
            signingConfig signingConfigs.debug
            minifyEnabled false
        }
    }
}

def rustBuild = tasks.register("rustBuild") {
    doLast {
        exec {
            environment << ['ANDROID_NDK_ROOT': "${android.ndkDirectory}"]
            workingDir file("$projectDir/../../crust-build").absolutePath
            commandLine 'cargo', 'run', '--', '--target', 'android', '--variant', gradle.startParameter.taskNames[0].contains('assembleRelease') ? 'release' : 'debug'
        }
    }
}

preBuild.dependsOn rustBuild

The bottom section is our custom Gradle task which will cause our crust-build code to be invoked. If you look at the commandLine argument you can see that we are essentially running:

cargo run -- --target android --variant <debug/release>

You can also see that we have declared ndkVersion '22.1.7171670' - this helps our Android app build pipeline to be aware of which NDK will be used during a build and lets us reference the android.ndkDirectory property to find out where it is in the file system. This is fed into an environment variable named ANDROID_NDK_ROOT - we will read this environment variable in our crust-build code to know where the NDK is.

The workingDir property ensures we are in the crust-build directory and we use the following code to figure out if we are doing a debug or release build:

gradle.startParameter.taskNames[0].contains('assembleRelease')

Note: Interrogating the Gradle task name of the current execution to figure out if we are doing a release build might seem a bit odd if you have done Android development before, but we have to do it like this because preBuild happens before Gradle has fully configured the kind of build we are trying to do.
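
On the crust-build side this debug/release choice surfaces as the context.variant value we use later when compiling. As a reminder, a variant type along these lines maps onto Cargo's conventions - this is only a rough sketch, since the real Variant enum was built in an earlier part of the series and may differ in detail:

#[derive(PartialEq)]
pub enum Variant {
    Debug,
    Release,
}

impl Variant {
    // Cargo output directory name, e.g. target/<triple>/debug or target/<triple>/release.
    pub fn id(&self) -> &str {
        match self {
            Variant::Debug => "debug",
            Variant::Release => "release",
        }
    }

    // Extra compiler flag - release builds pass `--release`, debug builds pass nothing.
    pub fn rust_compiler_flag(&self) -> &str {
        match self {
            Variant::Debug => "",
            Variant::Release => "--release",
        }
    }
}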

The key part of making our crust-build code run is the addition of the rustBuild Gradle task dependency to the preBuild stage. This causes it to run as a precursor to the regular Android build - meaning that by the time the normal Android build process reaches its later build stages, our Rust code will already have been compiled and prepared for it:

preBuild.dependsOn rustBuild

When you are done, your Android project should look something like this:

Run the Android project and although you will get a compilation error about the main activity, if you look in the build output you will see that our crust-build code actually executed:

> Task :app:rustBuild
   Compiling crust-build v1.0.0 (<snip>\crust-build)
    Finished dev [unoptimized + debuginfo] target(s) in 1.76s
     Running `target\debug\crust-build.exe --target android --variant debug`
[ print_summary ] ---------------------------------------------
[ print_summary ] Assets dir:          "<snip>\\crust-main\\assets"
[ print_summary ] Working dir:         "<snip>\\android\\.rust-build"
[ print_summary ] Rust build dir:      "<snip>\\android\\.rust-build\\rust"
[ print_summary ] Variant:             Debug
[ print_summary ] Target home dir:     "<snip>\\android"
[ print_summary ] Main source dir:     "<snip>\\crust-main"
[ print_summary ] ---------------------------------------------

Cool huh? That is actually all we need to do in the Android Studio project - the rest will happen in our crust-build implementation of the Android target.

NDK path and Rust toolchains

The first thing we’ll do is update android.rs in our crust-build project.

NDK location

We can start by fetching the value of the Android NDK location. Add the following to the build method (you will need to edit your use imports as we go):

pub fn build(context: &Context) -> FailableUnit {
    context.print_summary();

    let ndk_dir = PathBuf::from(std::env::var("ANDROID_NDK_ROOT")?);
    logs::out(log_tag!(), &format!("Using Android NDK: {:?}", &ndk_dir));

    Ok(())
}

Go back to Android Studio and run the project again; this time you will see the following build output, showing that we now have the location of the Android NDK inside crust-build:

[ build ] Using Android NDK: "C:\\Users\\Joining\\AppData\\Local\\Android\\Sdk\\ndk\\22.1.7171670"

Android architectures

Next up we are going to create a new enum to help us model the Android architectures we’ll be building. Each architecture will have a bunch of associated tools that we’ll use during the build. Add the following:

enum Architecture {
    ARMv8A,
    ARMv7A,
    X86,
    X86_64,
}

impl Architecture {
    fn jni_name(&self) -> String {
        String::from(match self {
            Architecture::ARMv8A => "arm64-v8a",
            Architecture::ARMv7A => "armeabi-v7a",
            Architecture::X86 => "x86",
            Architecture::X86_64 => "x86_64",
        })
    }

    fn ndk_triple(&self) -> String {
        String::from(match self {
            Architecture::ARMv8A => "aarch64-linux-android",
            Architecture::ARMv7A => "armv7a-linux-androideabi",
            Architecture::X86 => "i686-linux-android",
            Architecture::X86_64 => "x86_64-linux-android",
        })
    }

    fn rust_triple(&self) -> String {
        String::from(match self {
            Architecture::ARMv8A => "aarch64-linux-android",
            Architecture::ARMv7A => "armv7-linux-androideabi",
            Architecture::X86 => "i686-linux-android",
            Architecture::X86_64 => "x86_64-linux-android",
        })
    }

    fn strip_triple(&self) -> String {
        String::from(match self {
            Architecture::ARMv8A => "aarch64-linux-android",
            Architecture::ARMv7A => "arm-linux-androideabi",
            Architecture::X86 => "i686-linux-android",
            Architecture::X86_64 => "x86_64-linux-android",
        })
    }
}

The meanings of these are:

  1. jni_name: the directory name Android expects for each architecture's native libraries (for example arm64-v8a) - used under ndk/libs and later jniLibs
  2. ndk_triple: the prefix of the NDK toolchain binaries for the architecture, such as the clang wrapper and the ar archiver
  3. rust_triple: the Rust target triple we pass to rustup and cargo when installing toolchains and compiling
  4. strip_triple: the prefix of the NDK strip tool we use to shrink release builds

Most of these mappings almost seem the same but there are a few subtle differences - particularly around the ARMv7A architecture which has a bit of inconsistency in its tool names.
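
To make that ARMv7A inconsistency concrete, here is a small illustrative snippet (purely for demonstration) showing how the four names line up for that architecture:

let arch = Architecture::ARMv7A;

// Directory name Android expects for native libraries (used under ndk/libs and jniLibs).
assert_eq!(arch.jni_name(), "armeabi-v7a");

// Prefix of the NDK toolchain binaries such as the clang wrapper and ar.
assert_eq!(arch.ndk_triple(), "armv7a-linux-androideabi");

// Rust target triple for rustup and cargo - note `armv7`, not `armv7a`.
assert_eq!(arch.rust_triple(), "armv7-linux-androideabi");

// Prefix of the NDK strip tool - plain `arm` with no version suffix at all.
assert_eq!(arch.strip_triple(), "arm-linux-androideabi");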

Install Rust toolchains

Next add a new method to install the required Rust toolchains we will need to compile for each of the Android architectures - note how we use the new Architecture enum for the name of the Rust triples to install:

fn install_rust_dependencies() -> FailableUnit {
    logs::out(log_tag!(), "Installing Android Rust targets ...");
    scripts::run(&Script::new(&format!(
        "rustup target add {} {} {} {}",
        Architecture::ARMv8A.rust_triple(),
        Architecture::ARMv7A.rust_triple(),
        Architecture::X86.rust_triple(),
        Architecture::X86_64.rust_triple(),
    )))
}

Update the build method to install the Rust dependencies:

install_rust_dependencies()?;

Hop back and run the Android Studio project again and you should see it downloading and installing the Rust toolchains for each Android architecture.

SDL libraries

When we use the SDL libraries in Android, we need to do a few things:

  1. Expose the SDLActivity and other supporting SDL Java classes to our Android project
  2. Build the SDL libraries into the .so format for each Android architecture and bundle the resulting .so files into our Android project - the SDL library source code contains Android projects that we can compile ourselves to produce these

Add the following constants with the download links for SDL and the names of the directories within our .rust-build working directory structure to download them into:

const SDL2_SOURCE_URL: &str = "https://www.libsdl.org/release/SDL2-2.0.14.zip";
const SDL2_SOURCE_DIR: &str = "SDL";

const SDL2_IMAGE_SOURCE_URL: &str = "https://www.libsdl.org/projects/SDL_image/release/SDL2_image-2.0.5.zip";
const SDL2_IMAGE_SOURCE_DIR: &str = "SDL2_image";

Note: It may seem a bit odd that the SDL2 library is going into SDL rather than SDL2, but the SDL2_image Android project expects to find the SDL2 source code in a sibling SDL directory. I assume this is due to some historical reason by the authors.

To create the SDL .so files, we need to do NDK builds for SDL2 and SDL2 Image. We’ll do these builds through an intermediate NDK project so the outputs of both libraries end up in the same place. We’ll put the intermediate NDK project in .rust-build/ndk. Add a ndk_project_dir method to help the rest of our build code know where to find it:

fn ndk_project_dir(context: &Context) -> PathBuf {
    context.working_dir.join("ndk")
}

When an NDK build is complete, the resulting .so files will end up in .rust-build/ndk/libs - add a compiled_libs_dir method so our build code knows where to look for them:

fn compiled_libs_dir(context: &Context) -> PathBuf {
    ndk_project_dir(context).join("libs")
}

Setup SDL

Add the following method to setup SDL:

fn setup_sdl2(context: &Context, ndk_dir: &PathBuf) -> FailableUnit {
    let ndk_project_dir = ndk_project_dir(context);

    io::create_dir(&ndk_project_dir)?;

    remote_zips::fetch(SDL2_SOURCE_URL, SDL2_SOURCE_DIR, &ndk_project_dir)?;
    remote_zips::fetch(SDL2_IMAGE_SOURCE_URL, SDL2_IMAGE_SOURCE_DIR, &ndk_project_dir)?;

    let sdl_java_source_symlink = context.target_home_dir.join("app").join("src").join("main").join("java").join("org");
    io::create_symlink(
        &ndk_project_dir
            .join(SDL2_SOURCE_DIR)
            .join("android-project")
            .join("app")
            .join("src")
            .join("main")
            .join("java")
            .join("org"),
        &sdl_java_source_symlink,
        &context.target_home_dir,
    )?;

    io::write_string("include $(call all-subdir-makefiles)", &ndk_project_dir.join("Android.mk"))?;
    logs::out(log_tag!(), "Compiling SDL NDK libraries (this may take a while!) ...");
    io::delete(&compiled_libs_dir(context))?;

    scripts::run(
        &Script::new(&format!(
            "{:?} NDK_PROJECT_PATH={:?} APP_BUILD_SCRIPT={:?} APP_PLATFORM=android-21 APP_STL=c++_shared APP_ABI=all",
            &ndk_dir.join("ndk-build"),
            &ndk_project_dir,
            &ndk_project_dir.join("Android.mk"),
        ))
        .working_dir(&ndk_project_dir),
    )
}

Let’s walk through this - first up we ensure that we have a .rust-build/ndk directory:

let ndk_project_dir = ndk_project_dir(context);

io::create_dir(&ndk_project_dir)?;

We then download and unzip the SDL2 libraries:

remote_zips::fetch(SDL2_SOURCE_URL, SDL2_SOURCE_DIR, &ndk_project_dir)?;
remote_zips::fetch(SDL2_IMAGE_SOURCE_URL, SDL2_IMAGE_SOURCE_DIR, &ndk_project_dir)?;

Notice that we are downloading them into the .rust-build/ndk directory, rather than into .rust-build. We would end up with:

+ root
    + android
        + .rust-build
            + ndk
                + SDL
                + SDL2_image

Next up we are fixing the problem of our Android application not knowing about the SDLActivity class. We do this by setting up a symlink from the actual Java source code in the downloaded SDL2 library into our Android application. This then makes the SDL source classes available in Android:

let sdl_java_source_symlink = context.target_home_dir.join("app").join("src").join("main").join("java").join("org");
io::create_symlink(
    &ndk_project_dir
        .join(SDL2_SOURCE_DIR)
        .join("android-project")
        .join("app")
        .join("src")
        .join("main")
        .join("java")
        .join("org"),
    &sdl_java_source_symlink,
    &context.target_home_dir,
)?;

Next up we write a new text file into the .rust-build/ndk directory named Android.mk. When we ask the NDK to build this file, it will recurse through all subdirectories looking for Android NDK projects and build them. For us this means it will find NDK projects in the SDL and SDL2_image directories:

io::write_string("include $(call all-subdir-makefiles)", &ndk_project_dir.join("Android.mk"))?;
logs::out(log_tag!(), "Compiling SDL NDK libraries (this may take a while!) ...");

We also need to delete the compiled libs directory - there is a nasty bug on Windows where the build will fail if compiled output from a prior build already exists:

io::delete(&compiled_libs_dir(context))?;

Finally, we invoke the Android NDK, asking it to build the Android.mk file in the .rust-build/ndk directory. This will in turn recurse and build both the SDL and SDL2_image Android projects, collecting their outputs into .rust-build/ndk/libs:

scripts::run(
    &Script::new(&format!(
        "{:?} NDK_PROJECT_PATH={:?} APP_BUILD_SCRIPT={:?} APP_PLATFORM=android-21 APP_STL=c++_shared APP_ABI=all",
        &ndk_dir.join("ndk-build"),
        &ndk_project_dir,
        &ndk_project_dir.join("Android.mk"),
    ))
    .working_dir(&ndk_project_dir),
)

We are specifying the android-21 platform as it is our minimum supported Android version, asking to use the c++_shared standard template library and setting APP_ABI=all which will cause all four of the main architectures to be created (x86, x86_64, armv7a, armv8a).

If you go and run the Android Studio project again, you will see the SDL libraries download and a whole heap of NDK compilation happen (it may take a while the first time). If you are observing carefully you will also notice that each SDL library is compiled four times - once for each Android architecture.

Additionally you will see that we no longer have syntax errors in our MainActivity.java because the symlinked SDL source code is now visible to our project. If you look in the file system you will see something like:

+ root
    + android
        + .rust-build
            + ndk
                + libs
                    + arm64-v8a
                        - libc++_shared.so
                        - libhidapi.so
                        - libSDL2.so
                        - libSDL2_image.so
                    + armeabi-v7a
                        - libc++_shared.so
                        - libhidapi.so
                        - libSDL2.so
                        - libSDL2_image.so
                    + x86
                        - libc++_shared.so
                        - libhidapi.so
                        - libSDL2.so
                        - libSDL2_image.so
                    + x86_64
                        - libc++_shared.so
                        - libhidapi.so
                        - libSDL2.so
                        - libSDL2_image.so
                + obj
                + SDL
                + SDL2_image
                - Android.mk

Each of the Android architectures is represented in the libs directory. We will copy these .so files into our Android project later in our build code so they are bundled with the Android app, but until then you will get a runtime error because our main activity tries to load the .so libraries and can't find them.

Assets

Next up we will link our assets directory so it is bundled into the Android app. Android already has the concept of assets, and that is where SDL will look by default when performing file loading operations via RWOps. We'll use a symlink to avoid unnecessary file copying during the Rust build stage - Android will actually take a copy of the symlinked assets when it bundles everything together. Add:

fn setup_assets(context: &Context) -> FailableUnit {
    let app_assets_dir = context.target_home_dir.join("app").join("src").join("main").join("assets");
    let app_assets_symlink_dir = app_assets_dir.join("assets");

    io::create_dir(&app_assets_dir)?;
    io::create_symlink(&context.assets_dir, &app_assets_symlink_dir, &context.target_home_dir)?;

    Ok(())
}

We will end up with android/app/src/main/assets/assets pointing at crust-main/assets.

Note: The double assets/assets is not a mistake - the parent assets directory is the special Android directory where you would put any asset files to include in your final app.

Update the build method to set up our assets:

setup_assets(context)?;

Cargo manifest

Ok, so there is something we need to do at this point which is a bit icky. We want to build our crust-main project as a library (crustlib), however in order for Rust to build the library in an Android compatible format we need the library crate-type to be cdylib. At the moment we have it set to lib which just means a regular Rust library (look in Cargo.toml):

[lib]
name = "crustlib"
path = "src/lib.rs"
crate-type = ["lib"]

Unfortunately, at least at the time of authoring this series, there was no way to dynamically change the crate-type at build time. There is a long standing issue about this and the problems this causes for build targets such as Android and iOS and there is even an experimental setting under development to help solve it: https://doc.rust-lang.org/nightly/cargo/reference/unstable.html#crate-type.

I don’t want to use experimental nightly Rust versions to compile my code so our approach to this problem will be:

  1. Before compiling our Rust code, take a copy of the main Cargo.toml file and put it into the .rust-build directory.
  2. Update the copied file to change the crate-type and paths to find the associated source code to build.
  3. Compile our Rust code by pointing at the .rust-build/Cargo.toml instead of crust-main/Cargo.toml.

When the dynamic crate-type setting becomes stable in the Rust toolchain then we won’t need this workaround but I don’t know when that will be and I want to finish writing this series :)

To support us editing a copy of the Cargo.toml file, we will use a third party crate that can parse and make changes to toml files generally: https://docs.rs/toml_edit/0.2.0/toml_edit/. Update crust-build/Cargo.toml to include the toml_edit crate:

[package]
name = "crust-build"
version = "1.0.0"
authors = ["Marcel Braghetto"]
edition = "2021"
rust-version = "1.59.0"

[dependencies]
clap = "2.33.3"
reqwest = { version = "0.11", features = ["blocking"] }
tempfile = "3.2.0"
zip = "0.5.12"
toml_edit = "0.2.0"

We will actually need this for the iOS target as well so we'll write a new core service component to help us. Before writing it though, we need to add the ability to read the text content of a file to our io.rs core service component. Edit crust-build/src/core/io.rs and add a new method:

pub fn read_string(path: &PathBuf) -> Failable<String> {
    Ok(std::fs::read_to_string(&path)?)
}

Now add crust-build/src/core/manifests.rs (don’t forget to add it to mod.rs):

use crate::{
    core::{
        failable_unit::FailableUnit,
        {context::Context, io, logs},
    },
    log_tag,
};

pub fn create(context: &Context, crate_type: &str) -> FailableUnit {
    logs::out(log_tag!(), "Creating custom Cargo.toml manifest ...");

    let main_source_dir = context.source_dir.join("src");
    let manifest_content = io::read_string(&context.source_dir.join("Cargo.toml"))?;
    let mut manifest = manifest_content.parse::<toml_edit::Document>()?;

    let lib_src = format!("{:?}", &main_source_dir.join("lib.rs"));
    manifest["lib"]["path"] = toml_edit::value(lib_src);

    let crate_types = manifest["lib"]["crate-type"].as_array_mut().ok_or("Field 'lib/crate-type' not found in manifest!")?;

    while crate_types.iter().count() > 0 {
        crate_types.remove(0);
    }

    crate_types.push(crate_type).map_err(|_| "Failed to set manifest crate-type")?;

    let bin_src = format!("{:?}", &main_source_dir.join("bin.rs"));
    manifest["bin"]
        .as_array_of_tables_mut()
        .ok_or("Missing 'bin' manifest entry")?
        .get_mut(0)
        .ok_or("Missing 'bin' manifest element 0")?["path"] = toml_edit::value(bin_src);

    io::write_string(&manifest.to_string(), &context.working_dir.join("Cargo.toml"))
}

First off we will read in the content of crust-main/Cargo.toml:

logs::out(log_tag!(), "Creating custom Cargo.toml manifest ...");

let main_source_dir = context.source_dir.join("src");
let manifest_content = io::read_string(&context.source_dir.join("Cargo.toml"))?;

We then parse the content into a TOML document object via the toml_edit crate:

let mut manifest = manifest_content.parse::<toml_edit::Document>()?;

The manifest object is marked with mut so we can make changes to it. The first change is to replace the lib/path element with the full path to lib.rs:

let lib_src = format!("{:?}", &main_source_dir.join("lib.rs"));
manifest["lib"]["path"] = toml_edit::value(lib_src);

Next we will locate the existing crate-type element at lib/crate-type and delete all of its elements (crate-type is an array), then add our own crate-type in the appropriate place in the document with the crate_type argument passed into the create method. This is the key part to the whole problem of not being able to dynamically specify a crate type:

let crate_types = manifest["lib"]["crate-type"].as_array_mut().ok_or("Field 'lib/crate-type' not found in manifest!")?;

while crate_types.iter().count() > 0 {
    crate_types.remove(0);
}

crate_types.push(crate_type).map_err(|_| "Failed to set manifest crate-type")?;

We need to update the source file path for the binary target too - similar to our library target. The only difference is that in a Cargo manifest the [[bin]] section is actually an array of binary targets, whereas [lib] is just a single element (you can't have more than one library in a Cargo manifest for some reason):

let bin_src = format!("{:?}", &main_source_dir.join("bin.rs"));
manifest["bin"]
    .as_array_of_tables_mut()
    .ok_or("Missing 'bin' manifest entry")?
    .get_mut(0)
    .ok_or("Missing 'bin' manifest element 0")?["path"] = toml_edit::value(bin_src);

Finally, we serialize out the edited TOML document object into android/.rust-build/Cargo.toml:

io::write_string(&manifest.to_string(), &context.working_dir.join("Cargo.toml"))

Set Android Cargo manifest

Ok, with the manifests core service available, go back to android.rs and add a new method to use it:

fn setup_cargo_manifest(context: &Context) -> FailableUnit {
    logs::out(log_tag!(), "Creating Android Cargo.toml file ...");
    manifests::create(context, "cdylib")
}

Notice we are passing cdylib which is the crate-type we need our Rust code to compile to for compatibility with Android.

Update build to call it:

setup_cargo_manifest(context)?;

Go back to Android Studio and run again, afterward you should now see android/.rust-build/Cargo.toml which looks like this:

[package]
name = "crust"
version = "1.0.0"
authors = ["Marcel Braghetto"]
edition = "2021"
rust-version = "1.59.0"

[lib]
name = "crustlib"
path = "C:\\<snip>\\crust-main\\src\\lib.rs"
crate-type = ["cdylib"]

[[bin]]
name = "crust"
path = "C:\\<snip>\\crust-main\\src\\bin.rs"

[dependencies]
libc = "0.2.88" # Note iOS targets won't compile without this.
gl = "0.14.0"
tobj = "2.0.3"
glm = "0.2.3"
sdl2-sys = "0.34.4"

[dependencies.sdl2]
version = "0.34.4"
default-features = false
features = ["use_mac_framework", "image"]

Note that the path elements of [lib] and [[bin]] are now absolute paths to the crust-main/src/* locations and the crate-type in [lib] is now ["cdylib"].

Compiling

Ok, this is the chunky bit - we need to compile our crustlib library into an .so file for each Android architecture. Even though Rust has toolchains for these architectures, it doesn't know how to perform Android NDK operations so we need to help it along.

We’ll be making use of some special Cargo environment variables which influence Rust builds which you can read about here: https://doc.rust-lang.org/cargo/reference/environment-variables.html. In particular we will be configuring the archiver and the linker to point at the Android NDK tooling.

Add the following constant which represents our .so file:

const CRUST_SO_FILE_NAME: &str = "libcrustlib.so";

Now add a new method to compile our code - note that it takes the path to the Android NDK as an argument:

Note: The HashMap comes from std::collections::HashMap.

fn compile_rust_code(context: &Context, ndk_dir: &PathBuf) -> FailableUnit {
    let is_windows = cfg!(target_os = "windows");

    let ndk_toolchain_dir = ndk_dir
        .join("toolchains")
        .join("llvm")
        .join("prebuilt")
        .join(if is_windows { "windows-x86_64" } else { "darwin-x86_64" })
        .join("bin");

    logs::out(log_tag!(), &format!("Using NDK toolchain at: {:?}", &ndk_toolchain_dir));

    for architecture in &vec![
        Architecture::ARMv8A,
        Architecture::ARMv7A,
        Architecture::X86,
        Architecture::X86_64,
    ] {
        let rust_triple = architecture.rust_triple();
        let ndk_triple = architecture.ndk_triple();
        let cargo_rust_triple = rust_triple.to_uppercase().replace("-", "_");

        logs::out(log_tag!(), &format!("Compiling architecture: {:?}", &rust_triple));

        let mut environment = HashMap::new();

        environment.insert(
            format!("CARGO_TARGET_{}_AR", &cargo_rust_triple),
            ndk_toolchain_dir
                .join(if is_windows {
                    format!(r"{}-ar.exe", ndk_triple)
                } else {
                    format!(r"{}-ar", ndk_triple)
                })
                .display()
                .to_string(),
        );

        environment.insert(
            format!("CARGO_TARGET_{}_LINKER", &cargo_rust_triple),
            ndk_toolchain_dir
                .join(if is_windows {
                    format!(r"{}30-clang.cmd", ndk_triple)
                } else {
                    format!(r"{}30-clang", ndk_triple)
                })
                .display()
                .to_string(),
        );

        environment.insert(
            format!("CARGO_TARGET_{}_RUSTFLAGS", &cargo_rust_triple),
            format!(
                "-Clink-arg=-L{} -lc++_shared -lhidapi -lSDL2 -lSDL2_image",
                &compiled_libs_dir(context).join(architecture.jni_name()).display().to_string()
            ),
        );

        scripts::run(
            &Script::new(&format!(
                "cargo rustc {} --target-dir {:?} --lib --target {}",
                context.variant.rust_compiler_flag(),
                context.rust_build_dir,
                rust_triple,
            ))
            .environment(&environment)
            .working_dir(&context.working_dir),
        )?;

        let compiled_crust_so_path = context.rust_build_dir.join(rust_triple).join(context.variant.id()).join(CRUST_SO_FILE_NAME);

        if context.variant == Variant::Release {
            logs::out(log_tag!(), &format!("Stripping .so library: {:?}", &compiled_crust_so_path));
            let strip_triple = architecture.strip_triple();
            let strip_tool = ndk_toolchain_dir.join(if is_windows {
                format!(r"{}-strip.exe", strip_triple)
            } else {
                format!(r"{}-strip", strip_triple)
            });

            scripts::run(
                &Script::new(&format!("{:?} {:?}", &strip_tool, &compiled_crust_so_path)).working_dir(&context.working_dir),
            )?;
        }

        io::copy(&compiled_crust_so_path, &compiled_libs_dir(context).join(architecture.jni_name()).join(CRUST_SO_FILE_NAME))?;
    }

    Ok(())
}

The breakdown - we start off by figuring out which directory in the Android NDK will contain the tools we will need. There is a subtle difference if you are on Windows:

let is_windows = cfg!(target_os = "windows");

let ndk_toolchain_dir = ndk_dir
    .join("toolchains")
    .join("llvm")
    .join("prebuilt")
    .join(if is_windows { "windows-x86_64" } else { "darwin-x86_64" })
    .join("bin");

Next we loop through each of the four Android architectures - we need to compile our Rust code for each one separately. We also calculate for the current architecture what its Rust and NDK triples are.

Important: The Rust triple needs to be converted to upper case and have dashes replaced with underscores in order to work as Cargo environment variables.

for architecture in &vec![
    Architecture::ARMv8A,
    Architecture::ARMv7A,
    Architecture::X86,
    Architecture::X86_64,
] {
    let rust_triple = architecture.rust_triple();
    let ndk_triple = architecture.ndk_triple();
    let cargo_rust_triple = rust_triple.to_uppercase().replace("-", "_");

Using the Rust and NDK triples we can create some environment variables to include with the compile command, which tell Cargo what linker to use, what archiver to use and what third party library files to link against. Note that we are using the compiled library directory from our SDL setup earlier which would resolve to android/.rust-build/ndk/libs - this allows the previously compiled .so files to be found during the linker stage.

let mut environment = HashMap::new();

environment.insert(
    format!("CARGO_TARGET_{}_AR", &cargo_rust_triple),
    ndk_toolchain_dir
        .join(if is_windows {
            format!(r"{}-ar.exe", ndk_triple)
        } else {
            format!(r"{}-ar", ndk_triple)
        })
        .display()
        .to_string(),
);

environment.insert(
    format!("CARGO_TARGET_{}_LINKER", &cargo_rust_triple),
    ndk_toolchain_dir
        .join(if is_windows {
            format!(r"{}30-clang.cmd", ndk_triple)
        } else {
            format!(r"{}30-clang", ndk_triple)
        })
        .display()
        .to_string(),
);

environment.insert(
    format!("CARGO_TARGET_{}_RUSTFLAGS", &cargo_rust_triple),
    format!(
        "-Clink-arg=-L{} -lc++_shared -lhidapi -lSDL2 -lSDL2_image",
        &compiled_libs_dir(context).join(architecture.jni_name()).display().to_string()
    ),
);

Then we use the set of environment variables in the shell script command to do the actual compilation for the current architecture - asking for our [lib] target to be built for the given rust_triple:

scripts::run(
    &Script::new(&format!(
        "cargo rustc {} --target-dir {:?} --lib --target {}",
        context.variant.rust_compiler_flag(),
        context.rust_build_dir,
        rust_triple,
    ))
    .environment(&environment)
    .working_dir(&context.working_dir),
)?;

Finally we will perform an additional strip operation to reduce the file size of the compiled .so files if we are doing a release build, then we copy the resulting .so file into the same place where our SDL .so files are:

let compiled_crust_so_path = context.rust_build_dir.join(rust_triple).join(context.variant.id()).join(CRUST_SO_FILE_NAME);

if context.variant == Variant::Release {
    logs::out(log_tag!(), &format!("Stripping .so library: {:?}", &compiled_crust_so_path));
    let strip_triple = architecture.strip_triple();
    let strip_tool = ndk_toolchain_dir.join(if is_windows {
        format!(r"{}-strip.exe", strip_triple)
    } else {
        format!(r"{}-strip", strip_triple)
    });

    scripts::run(
        &Script::new(&format!("{:?} {:?}", &strip_tool, &compiled_crust_so_path)).working_dir(&context.working_dir),
    )?;
}

io::copy(&compiled_crust_so_path, &compiled_libs_dir(context).join(architecture.jni_name()).join(CRUST_SO_FILE_NAME))?;

Ok, update the build method to call our compiler:

compile_rust_code(context, &ndk_dir)?;

Go back to Android Studio and run the app again - this time you will see our own crustlib library being compiled for each Android architecture, and if you look inside each architecture directory under android/.rust-build/ndk/libs you will now see libcrustlib.so in there!

+ root
    + android
        + .rust-build
            + ndk
                + libs
                    + arm64-v8a
                        ...
                        - libcrustlib.so
                    ...

Bundling libraries

Ok, we will now include all the .so files in our Android project through the special Android jniLibs directory, so they are bundled into the final APK such that our main activity can find them and start crust. Add the following method:

fn link_jni_libs(context: &Context) -> FailableUnit {
    let app_jni_libs_dir = context.target_home_dir.join("app").join("src").join("main").join("jniLibs");

    logs::out(log_tag!(), "Linking 'libs' into Android app 'jniLibs' ...");
    io::create_symlink(&compiled_libs_dir(context), &app_jni_libs_dir, &context.target_home_dir)?;

    Ok(())
}

We are creating a symlink from the android/.rust-build/ndk/libs directory to android/app/src/main/jniLibs. The jniLibs directory in an Android app is where you put native shared libraries. Although this is a symlink, Android will actually make a copy of these files during the build into the resulting APK.

Update the build method:

link_jni_libs(context)?;
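
With that, all of the Android specific steps are wired into our build pipeline. For reference, the build method in android.rs should now look roughly like this - a recap assembled from the pieces we added throughout this article, so treat it as a sketch rather than the definitive version:

pub fn build(context: &Context) -> FailableUnit {
    context.print_summary();

    // Location of the Android NDK, supplied by our custom Gradle task.
    let ndk_dir = PathBuf::from(std::env::var("ANDROID_NDK_ROOT")?);
    logs::out(log_tag!(), &format!("Using Android NDK: {:?}", &ndk_dir));

    install_rust_dependencies()?;
    setup_sdl2(context, &ndk_dir)?;
    setup_assets(context)?;
    setup_cargo_manifest(context)?;
    compile_rust_code(context, &ndk_dir)?;
    link_jni_libs(context)?;

    Ok(())
}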

Ok, run the app again and this time you shouldn’t get any error dialogs - however you also won’t see our 3D world and the app exits immediately after starting. To fix this, there is a small amount of code we need to add to our crust-main project to activate the Android entrypoint and successfully bootstrap the app.

Activate Android

For SDL based Android (and iOS) projects, we need to implement a special SDL_main method and make sure it is exposed externally from our Rust code. SDL will look for this when it starts and will invoke it automatically if it finds it, giving us a hook to kick off our main loop and run our program. Edit crust-main/src/lib.rs and add the following method:

#[cfg(any(target_os = "android", target_os = "ios"))]
#[no_mangle]
pub extern "C" fn SDL_main(_argc: libc::c_int, _argv: *const *const libc::c_char) -> libc::c_int {
    main();
    return 0;
}

We are using extern "C" to expose this method outside the Rust context - allowing foreign code (in our case SDL) to invoke it. We also apply the #[no_mangle] attribute to prevent the Rust compiler from mangling the method name. Inside the method we simply call main() which then kicks off the whole show for us.

While we are here we will fix an issue related to device orientation on Android and iOS - edit crust-main/src/core/launcher.rs and add the following code at the top of the launch method:

pub fn launch() -> FailableUnit {
    if cfg!(target_os = "android") || cfg!(target_os = "ios") {
        sdl2::hint::set("SDL_IOS_ORIENTATIONS", "LandscapeLeft LandscapeRight");
    }

    ...

On Android in particular if we don’t do this then rotating the device into portrait orientation will cause our program to be resized into the size of the portrait screen, even though we configured the Android manifest to ignore orientation changes and be locked into landscape only.

The name of this hint is a bit misleading (SDL_IOS_ORIENTATIONS) as it isn’t limited to iOS but actually influences Android as well. If you want to run your game in portrait mode instead of landscape, adjust these values to use the appropriate hints: https://wiki.libsdl.org/SDL_HINT_ORIENTATIONS.

If you are building an app that should be locked in portrait instead, you can adjust this code to suit.
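
For example, a portrait locked app might set the hint like this (orientation values as documented by SDL):

// Hypothetical portrait only variant of the same hint.
sdl2::hint::set("SDL_IOS_ORIENTATIONS", "Portrait PortraitUpsideDown");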

Alrighty - go back to Android Studio and run it again:

Woohoo! You can press near the edges of the scene to move around with touch controls.

Summary

Ok, nothing in Android ever seems to be easy but to be honest the work required to get Rust running well with SDL seemed simpler than with C/C++, though figuring out the right incantations along the way was certainly a mission!

In the next article we will hop back to MacOS and build the bundled MacOS Desktop target.

The code for this article can be found here.

Continue to Part 11: MacOS Desktop.

End of part 10