How to Turn Your Phone Into a Remote Sprite Controller (And Other Cool Ideas)
Using Your Phone as a Sensor
Devin graduated from Middle Tennessee State University in 2020, majoring in math and computer science. Aiming for a PhD in computer science, he applied to graduate school and was accepted into Vanderbilt's PhD program. Devin has always loved academia and wants to eventually become a professor to share knowledge of both computer science and math---and especially their interactions---with new generations. At Vanderbilt, Devin works as a research assistant and has the opportunity to explore another area of interest: developing educational software.
Thanks for helping with Snap!Con 2021!
Many schools offer makerspaces and other opportunities for students to get their hands on simple embedded computers, sensors, and educational robots. However, most do not. The kinds of sensors and devices available are limited by cost, and such activities are tied to the physical lab, making remote education difficult. But the mobile devices that most students already own contain a rich collection of sensors and are connected to the internet out of the box. This presents an opportunity to teach concepts related to IoT, networking, and distributed computing in a way that is not only accessible to novices, but also highly engaging and motivating. To make this approach a reality, we have created PhoneIoT, a mobile app for Android and iOS that allows the device's built-in sensors to be accessed remotely from NetsBlox, a Snap! extension. Since these devices have touchscreens as well, PhoneIoT makes it possible to configure a GUI on the phone from the same NetsBlox program that processes the sensor data and handles events from the mobile device. Hence, students can build truly distributed applications that run on two or more computers connected via the internet and interact with the physical world through sensors.
In this workshop, participants will build two applications. The first is similar to the popular handheld non-electronic maze game, in which the player tilts the board to roll a ball through a maze while avoiding holes it can fall into. In the PhoneIoT version, the stage contains many hole sprites that the ball must avoid on its way from the lower left corner to the top right, and the ball moves based on streaming accelerometer data from the phone: as the user tilts the phone, the ball on the stage, running on a laptop, responds accordingly. In other words, the phone becomes a game controller. We will provide a game template that initializes the stage with hole clones and includes the code that detects the ball falling into them. Participants will work with the instructors to access the phone's sensors and write the code that controls the movement of the ball, resulting in a functional game.
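The core of the ball-control logic is simple: treat the tilt-induced x/y accelerometer components as a velocity and clamp the ball to the stage edges. The workshop itself implements this with NetsBlox blocks and the PhoneIoT service; the sketch below only illustrates the idea in Python, and the names, sensitivity constant, and axis signs are assumptions (real accelerometer axis conventions vary by device and may need flipping).

```python
# Illustrative sketch only -- the workshop uses NetsBlox blocks, not Python.
# All names and constants here are hypothetical.

STAGE_HALF_W, STAGE_HALF_H = 240, 180  # Snap!'s stage is 480x360
SENSITIVITY = 2.0  # how strongly tilt moves the ball (tuning assumption)

def step_ball(x, y, accel):
    """Advance the ball one step from an (ax, ay, az) accelerometer reading.

    Tilting the phone shifts gravity into the x/y components, so those
    components serve directly as a velocity. The new position is clamped
    so the ball stays on the stage.
    """
    ax, ay, _ = accel
    x = max(-STAGE_HALF_W, min(STAGE_HALF_W, x + SENSITIVITY * ax))
    y = max(-STAGE_HALF_H, min(STAGE_HALF_H, y + SENSITIVITY * ay))
    return x, y

# A level phone (gravity entirely in z) leaves the ball in place:
print(step_ball(0, 0, (0.0, 0.0, 9.81)))  # → (0.0, 0.0)
```

In the actual game loop, this step runs every time a new accelerometer message arrives from the phone, and the hole-collision code from the template checks the resulting position.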
The second application is an exercise tracker. It will display a Google Maps background both on the stage and on the phone's screen, showing the user's current location and their animated track as they move around. The phone will show start and stop buttons and a text area displaying the total distance covered. Again, a template will be provided, and participants will work to finish the application.
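The total-distance display boils down to summing great-circle distances between successive GPS fixes. The workshop builds this with NetsBlox blocks; the Python sketch below shows the underlying haversine computation, with all function names being illustrative assumptions.

```python
# Illustrative sketch of the tracker's distance computation (names hypothetical).
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude) points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def total_distance_m(track):
    """Sum the leg distances along a list of (lat, lon) fixes."""
    return sum(haversine_m(*a, *b) for a, b in zip(track, track[1:]))
```

Conceptually, each new location fix from the phone is appended to the track, the running total is recomputed (or incremented by the last leg), and the result is written to the phone's text display area.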
The introductory slides shown at the beginning of the presentation are available here.