The Android application acts as the remote control and “mouth” of the system. It is responsible for issuing commands remotely to the Arduino and Algorithms subsystems to coordinate robot operations, and for displaying messages, the arena state, and captured images. The application is written in Kotlin using Android Studio.
Bluetooth connection to Raspberry Pi subsystem
The application achieves its responsibilities via a Bluetooth connection to the Raspberry Pi subsystem, through which it communicates with the other subsystems. The connection is made over a Bluetooth socket and is managed by a singleton Bluetooth Controller class, and may be initiated by the user via a Bluetooth settings UI similar to the one built into the Android OS.
Bluetooth controller
The controller acts as the primary interface between the Bluetooth functionalities and the application activities. The controller exposes socket IO functionalities and handles three threads:
- Server listener thread (SLT)
- Client connection thread (CCT)
- Active connection thread (ACT)
Typically, a connection is initiated from the application via the CCT. The SLT is, and should only be, used for debugging scenarios where the application listens for an incoming connection. The ACT is launched automatically on a successful connection; it actively listens for incoming messages on the socket and performs a callback to the active activity whenever a message is received. The ACT also provides a blocking write function to send messages to the other subsystems via the RPi.
Connection and message listening are done on separate threads to prevent the UI from locking up. The threads’ functionalities are adapted from the Android Bluetooth developer guide; refer to the guide for code examples.
Receiving messages from various subsystems
The ACT triggers a callback to the active activity whenever a message is received via the Bluetooth socket. The callback is implemented using higher-order functions in Kotlin. The activity is responsible for parsing the message and acting on it.
Example
// In the activity
// The higher-order callback function is passed in the curly braces
BluetoothController.startClient(device) { status, message ->
    // message parsing depending on the status / message type
}

// In BluetoothController
fun startClient(device: BluetoothDevice, callback: (status: Status, message: String) -> Unit) {
    this.callback = callback
    BluetoothClient(device).start()
}

// In ACT
try {
    val numBytes = inputStream.read(buffer)
    // Convert only the bytes read; buffer.toString() on a ByteArray would
    // return the array's identity string, not its contents
    BluetoothController.callback(BluetoothController.Status.READ, String(buffer, 0, numBytes))
} catch (e: IOException) {
    // Socket closed or connection lost
}
Sending messages to various subsystems
Messages may be sent to any of the other subsystems via the Bluetooth Controller. Each message carries a prefix denoting its destination subsystem.
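For illustration only — the actual prefix characters are team-defined and not specified here — a routing helper for tagging outgoing messages might look like:

```kotlin
// Hypothetical destination prefixes; the real characters are
// team-specific communication parameters configured elsewhere.
enum class Subsystem(val prefix: Char) {
    ARDUINO('A'),
    ALGORITHMS('P')
}

// Prepend the destination prefix so the RPi can route the message
// to the correct subsystem.
fun tagMessage(dest: Subsystem, payload: String): String =
    "${dest.prefix}$payload"
```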
Automatic reconnection
If a connection is terminated abruptly without user interaction, a reconnection to the last connected device is attempted automatically. This is achieved by storing the MAC address of the device when the connection was first established successfully and starting the CCT with the device parameters. The application will attempt a reconnection every 5 seconds, up to 12 times, for a total of 1 minute.
This feature is especially useful during test runs when minor changes need to be made on the RPi or when a script restart is required. Once the Raspberry Pi is ready, a connection is automatically re-established without the need for any user intervention, which saves an ample amount of time. As the application is not to be interacted with during the leaderboard challenge, this also acts as a contingency for any unexpected disconnections. Once reconnected, an arena and images update is automatically requested in case the disconnection happens at the end of the challenge.
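The reconnection schedule described above (5 seconds × 12 attempts = 1 minute) can be captured in a small policy object; this is a sketch with illustrative names, not the application's actual implementation:

```kotlin
// Reconnection policy: retry every 5 seconds, up to 12 times,
// giving a total reconnection window of 1 minute.
data class RetryPolicy(val intervalMs: Long = 5_000, val maxAttempts: Int = 12) {
    // Attempts are zero-indexed: attempt 0 through 11 are allowed.
    fun shouldRetry(attempt: Int): Boolean = attempt < maxAttempts

    // Total time spent retrying before giving up.
    val totalWindowMs: Long get() = intervalMs * maxAttempts
}
```

On each allowed attempt, the CCT would be restarted with the stored MAC address until a connection succeeds or the window is exhausted.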
Arena display
The arena is represented by 300 GestureImageViews contained in a GridView parent with 20 rows and 15 columns. Each view is sized equally and is relative to the display metrics of the actual device.
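The per-cell sizing reduces to simple arithmetic on the usable display width; a sketch (integer division, illustrative function name):

```kotlin
// Compute the side length of one square grid cell from the usable
// display width, given 15 columns. The same size is reused for all
// 20 rows so every cell in the GridView is equal.
fun cellSizePx(usableWidthPx: Int, columns: Int = 15): Int =
    usableWidthPx / columns
```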
GestureImageView
The GestureImageView is a custom-defined extension of ImageView that contains a GestureDetector (Android SDK built-in) allowing for a series of gestures to be detected and thus allowing interactions to be made with each grid directly:
- Single-tap
- Double-tap
- Long press
- Fling (swipe) with velocity detection
Interactive display
Each grid on the Arena Display may be interacted with using touch gestures to perform the following actions:
| Gesture | Action |
| --- | --- |
| Single Tap* | Plot or remove obstacles |
| Double Tap | Plot or remove waypoint |
| Long Press | Option to move start or goal point to the selected grid |
| Swipe Down | Refreshes entire arena if set to manual arena update |
| Swipe Left* | Undo the last action when plotting obstacles |
| Swipe Right* | Redo the last action when plotting obstacles |
* Plotting of obstacles needs to be enabled manually by tapping the ‘Plot’ button (Figure ?) to enter ‘Plot Mode’. This prevents unintended plots due to accidental taps.
By allowing the display to be interacted with, button space that would have been used for the above actions is now freed up to allow for a bigger Arena Display. A bigger display allows for easier interaction and display clarity, such as checking of coordinates for robot position, plotted images, etc.
The gestures are handled by overriding the appropriate gesture handler methods of the GestureDetector, namely:
- onSingleTapConfirmed
- onDoubleTap
- onLongPress
- onFling (swipe motions)
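A minimal sketch of such a view, assuming the Android SDK's GestureDetector and AndroidX AppCompatImageView (the wiring of results back to the activity is omitted, and names here are illustrative):

```kotlin
import android.content.Context
import android.util.AttributeSet
import android.view.GestureDetector
import android.view.MotionEvent
import androidx.appcompat.widget.AppCompatImageView

// Sketch only: an ImageView extension that forwards touch events to a
// GestureDetector so each grid cell reacts to gestures directly.
class GestureImageView(context: Context, attrs: AttributeSet? = null) :
    AppCompatImageView(context, attrs) {

    private val detector = GestureDetector(context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
                // Plot or remove an obstacle (when Plot Mode is active)
                return true
            }
            override fun onDoubleTap(e: MotionEvent): Boolean {
                // Plot or remove the waypoint
                return true
            }
            override fun onLongPress(e: MotionEvent) {
                // Offer to move the start or goal point here
            }
            override fun onFling(e1: MotionEvent?, e2: MotionEvent,
                                 velocityX: Float, velocityY: Float): Boolean {
                // Swipe direction and velocity decide refresh / undo / redo
                return true
            }
        })

    override fun onTouchEvent(event: MotionEvent): Boolean =
        detector.onTouchEvent(event) || super.onTouchEvent(event)
}
```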
Updates made to the arena are sent to the Algorithms subsystem immediately if a Bluetooth connection is active. Obstacles are updated when exiting ‘Plot Mode’ by sending an updated MDF string to the subsystem.
Synchronising with other subsystems
The arena display is updated whenever an MDF string or image string is received from the Algorithms subsystem; the former may also be requested manually by the application. The MDF string is decoded into a binary string by converting each hex character into binary individually and appending the results, as the entire MDF string is too long to be converted as a whole. The image string comprises a list of image IDs and coordinates, which are plotted accordingly on the arena display.
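The per-character decoding can be sketched in pure Kotlin (function name is illustrative):

```kotlin
// Decode an MDF hex string into a binary string one character at a
// time. Each hex digit expands to exactly 4 bits; converting the
// whole string at once would overflow standard integer types.
fun mdfToBinary(mdf: String): String = buildString {
    for (c in mdf) {
        append(c.digitToInt(16).toString(2).padStart(4, '0'))
    }
}
```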
Remote control for robot operations
Related: Sending messages to various subsystems
Manual robot operations are performed by sending commands remotely to the Arduino subsystem via the RPi. During leaderboard challenges, the application serves as the sole entry point and is responsible for sending the start signal to the Algorithms subsystem. Commands are sent when corresponding buttons are clicked on the application.
Runtime-configurable communication parameters
For reference, see the shared preferences developer guide.
As the remote control and mouth of the system, the application communicates heavily with the other subsystems. However, communication parameters may change from time to time due to requirement changes, eureka moments, previous miscommunications, and the like. While code-base changes could be made each time, the deployment process is rather time-consuming. As such, all communication parameters are saved locally in the application’s shared preferences, referenced whenever necessary, and configurable in the app at runtime.
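As a sketch of the approach — the key names and default values here are hypothetical, not the application's actual parameters:

```kotlin
import android.content.Context

// Sketch: a communication parameter stored in SharedPreferences so it
// can be edited at runtime without redeploying the application.
object CommConfig {
    private const val PREFS = "comm_params"

    // Read the command string sent for "move forward", falling back to
    // a default when the user has not configured one.
    fun forwardCommand(context: Context): String =
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
            .getString("cmd_forward", "F1") ?: "F1"

    // Update the parameter from a settings screen at runtime.
    fun setForwardCommand(context: Context, value: String) {
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
            .edit().putString("cmd_forward", value).apply()
    }
}
```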
Saving and loading of arena state
For reference, see the Room developer guide.
The current arena state displayed may be saved via MDF string to a local Room database, from which it may subsequently be restored to the display. When an arena state is restored, the MDF string is sent to the Algorithms subsystem for an update. This expedites test runs where a fully explored arena is required (e.g. fastest path) and allows custom-designed arenas to be saved for future reference and testing.
The following parameters are saved to preserve the arena state as much as possible:
- Map descriptor
- Obstacle descriptor
- Start point (X, Y)
- Waypoint (X, Y)
- Goal point (X, Y)
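A sketch of how such a state could be modelled with Room — the entity, column, and DAO names are illustrative, not the application's actual schema:

```kotlin
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.Insert
import androidx.room.PrimaryKey
import androidx.room.Query

// Sketch of a Room entity holding one saved arena state, covering the
// parameters listed above.
@Entity(tableName = "arena_state")
data class ArenaState(
    @PrimaryKey(autoGenerate = true) val id: Int = 0,
    val mapDescriptor: String,
    val obstacleDescriptor: String,
    val startX: Int, val startY: Int,
    val waypointX: Int, val waypointY: Int,
    val goalX: Int, val goalY: Int
)

@Dao
interface ArenaStateDao {
    @Insert fun save(state: ArenaState)

    @Query("SELECT * FROM arena_state")
    fun loadAll(): List<ArenaState>
}
```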