Week 5: 24th June – 30th June

Encoding the Animations

Before embarking on combining the two apps, Wei Bin decided to first get animations working in an Android AR app, so that the rest of the team would not be held up waiting to find out whether our animation file is compatible. Following the same approach used to incorporate the machine learning model, he searched for similar apps and found a sample written by the Google ARCore development team. However, the code is quite convoluted, and he has not yet located the portion that loads the animation file, so he has not been able to load our animation to see whether it works. We are slowly working through the different parts of the code with the help of online tutorials, and we should be able to get it running by next week.
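
In the meantime, based on the Sceneform documentation, we expect the animation-loading portion of the sample to look something like the sketch below. This is only a rough guess assuming a Sceneform-based ARCore app; the asset name andy_dance and the surrounding code are illustrative, not the sample's actual code.

    import com.google.ar.sceneform.animation.ModelAnimator;
    import com.google.ar.sceneform.rendering.ModelRenderable;

    // Build a renderable from an animated model asset placed in res/raw
    // (the asset name here is illustrative, not the sample's).
    ModelRenderable.builder()
            .setSource(this, R.raw.andy_dance)
            .build()
            .thenAccept(renderable -> {
                // Pull the first animation baked into the model file --
                // this is the step we have not yet located in the
                // sample's code -- and drive it with a ModelAnimator.
                ModelAnimator animator =
                        new ModelAnimator(renderable.getAnimationData(0), renderable);
                animator.start();
            });

If the sample does follow this pattern, swapping in our own animation should be a matter of replacing the asset and picking the right animation index.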

Improving the Machine Learning Model

Our previous attempt at producing a model did not perform as well as we had expected: it was unable to pick up intermediate steps, such as when the card slot had been partially ejected. Since we envisioned the app to be seamless, it was critical for it to identify these intermediate steps and guide the user along. Hence, Jian Xian saw the need to retrain the model to account for these transitions. As the image below shows, he had to explicitly label the partially ejected card slot to improve the model's accuracy.
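
Once the retrained model can tell the intermediate state apart, the guidance logic on the app side becomes straightforward. A rough sketch of the idea is below; the label names, showInstruction() helper, and instruction text are all hypothetical stand-ins, not our actual code.

    // Sketch: map the classifier's predicted label to the instruction
    // shown to the user. The extra "partially ejected" class is what
    // lets the app guide the user through the transition instead of
    // jumping straight between the inserted and ejected steps.
    private void guideUser(String predictedLabel) {
        switch (predictedLabel) {
            case "cardslot_inserted":
                showInstruction("Eject the card slot to begin.");
                break;
            case "cardslot_partially_ejected":  // the newly labelled class
                showInstruction("Pull the card slot fully out.");
                break;
            case "cardslot_ejected":
                showInstruction("Proceed to the next step.");
                break;
        }
    }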

To test whether the retrained model works, he integrated it into an Android app, and it was exciting to see the app correctly identify the intermediate step, as illustrated in the screenshot below.
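
For the curious, the integration itself involves relatively little code. Below is a minimal sketch assuming a TensorFlow Lite image classifier; the model file name, class count, and class order are our own illustrative choices, not the actual app's code.

    import android.content.Context;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.MappedByteBuffer;
    import org.tensorflow.lite.Interpreter;
    import org.tensorflow.lite.support.common.FileUtil;

    public class CardSlotClassifier {
        private final Interpreter interpreter;

        public CardSlotClassifier(Context context) throws IOException {
            // Load the retrained .tflite model bundled in the app's assets
            // (the file name here is illustrative).
            MappedByteBuffer model =
                    FileUtil.loadMappedFile(context, "cardslot_model.tflite");
            interpreter = new Interpreter(model);
        }

        // Runs one preprocessed camera frame through the model and
        // returns one confidence score per class (in this sketch:
        // inserted, partially ejected, ejected).
        public float[] classify(ByteBuffer preprocessedFrame) {
            float[][] output = new float[1][3];
            interpreter.run(preprocessedFrame, output);
            return output[0];
        }
    }

The app then simply feeds each camera frame to classify() and picks the highest-scoring class to decide which step the user is on.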
