
Week 10 - September 12-18

  • Writer: Jesus Diaz-Rivera
  • Sep 16, 2021
  • 3 min read

This week began with more calibration of the Bluetooth tracking and some further testing of the app's communication with the Uno controller. The goal of the initial readings was to find which settings detect the target distance the fastest for a given number of RSSI samples per second. In short, two variables were adjusted on each run: the signal strength that best represents the target distance, expressed in dBm (for example, -60 dBm), and the delay period between signal strength samples, in milliseconds. These were some initial readings:


Test 1

Delay period between each signal strength sample: 500 ms

Target RSSI: -75 dBm

Estimated RSSI when within approx. 3 feet or less: < -65 dBm

RSSI at around 5 ft (~1.5 m): between -70 and -75 dBm

Test 2

Delay period between each signal strength sample: 300 ms

Target RSSI: -65 dBm

Estimated RSSI when within approx. 3 feet or less: < -65 dBm

RSSI at around 5 ft (~1.5 m): between -65 and -70 dBm
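
As a rough sketch of how these two tuning values fit into the tracking loop, the snippet below checks each incoming RSSI reading against a target threshold and lights an LED when the target distance is reached. The variable names, the 9600 baud rate, and the assumption that the RSSI value arrives over serial as a text line are all placeholders for illustration, not the project's actual code.

    // Assumed tuning values from Test 1 above; adjusted on each test run.
    const int  TARGET_RSSI     = -75;   // dBm treated as "at the target distance"
    const long SAMPLE_DELAY_MS = 500;   // delay between signal strength samples
    const int  LED_PIN         = 13;    // LED standing in for "target reached"

    void setup() {
      Serial.begin(9600);               // assumed baud rate for the Bluetooth link
      pinMode(LED_PIN, OUTPUT);
    }

    void loop() {
      // Assumed protocol: the app forwards its latest RSSI reading as a text line, e.g. "-72"
      if (Serial.available() > 0) {
        String line = Serial.readStringUntil('\n');
        line.trim();
        if (line.length() > 0) {
          int rssi = line.toInt();
          // A stronger (less negative) reading than the target means the phone is close enough.
          if (rssi >= TARGET_RSSI) {
            digitalWrite(LED_PIN, HIGH);   // target distance reached
          } else {
            digitalWrite(LED_PIN, LOW);    // still too far away
          }
        }
      }
      delay(SAMPLE_DELAY_MS);             // wait before the next sample
    }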


RSSI readings also seem more stable when the WAIT button is pressed upon reaching the target. This could be useful as a feature: the cart waits, pausing RSSI collection until it is called again before following.

The second test showed that this version of the CALL function makes the cart a little more responsive, although there is still a delay before a change in distance registers. As long as the cart has a somewhat direct line of sight to the user's phone (the phone does not have to be held out directly, but there should be no large obstructions in between), the cart can detect when it is within the target distance of the user. The user needs to be aware of this if they wish the cart to follow them. The video below showcases one of the tests of the CALL function: CALL is pressed on the app, and upon approaching the test cart, the LEDs indicate that the target distance was reached.


Figure 1: CALL mode test video

Work on the code for motor control also began this week. It requires not only two Arduino sketches (one for each controller) but also its own logic in the app for handling a request, from the user or from the autonomous control, to move the cart in a specific direction. In the Primary Arduino code, each direction (forward, turn left, turn right, and so on) is its own void function that is called whenever the main code needs it; the function outputs the corresponding set of HIGHs and LOWs on the digital pins connected to the secondary controller. The secondary controller takes these HIGH/LOW signal combinations and tells the motors which direction to turn to complete the task. Figure 2 shows a snippet of that code for the secondary controller.
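
Before getting to the secondary side, here is a rough sketch of how those direction functions on the Primary Arduino might be laid out. The pin numbers and the particular HIGH/LOW combinations are placeholders for illustration, not the project's actual signal table from Week 8.

    // Digital pins on the Primary Uno wired to the Secondary Uno (assumed numbering).
    const int SIG_A = 2;
    const int SIG_B = 3;
    const int SIG_C = 4;

    void setup() {
      pinMode(SIG_A, OUTPUT);
      pinMode(SIG_B, OUTPUT);
      pinMode(SIG_C, OUTPUT);
    }

    // Each direction is its own void function writing one HIGH/LOW combination.
    void forward()   { digitalWrite(SIG_A, HIGH); digitalWrite(SIG_B, LOW);  digitalWrite(SIG_C, LOW);  }
    void turnLeft()  { digitalWrite(SIG_A, LOW);  digitalWrite(SIG_B, HIGH); digitalWrite(SIG_C, LOW);  }
    void turnRight() { digitalWrite(SIG_A, LOW);  digitalWrite(SIG_B, LOW);  digitalWrite(SIG_C, HIGH); }
    void stopCart()  { digitalWrite(SIG_A, LOW);  digitalWrite(SIG_B, LOW);  digitalWrite(SIG_C, LOW);  }

    void loop() {
      // The main code calls forward(), turnLeft(), and so on whenever the app
      // command or the autonomous CALL logic asks for that direction.
    }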



Figure 2: Code for directions on Secondary Controller
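
As a companion to Figure 2 (illustrative only, not the actual snippet in the figure), the secondary controller's side of this scheme could look like the sketch below: it reads the three signal pins and decodes the combination into a motor action. The pin numbers, the mapping, and the simple dual H-bridge motor-driver interface are all assumptions.

    // Inputs from the Primary Uno (assumed pin numbers).
    const int SIG_A = 2;
    const int SIG_B = 3;
    const int SIG_C = 4;

    // Outputs to an assumed dual H-bridge motor driver.
    const int LEFT_FWD  = 5;
    const int LEFT_REV  = 6;
    const int RIGHT_FWD = 9;
    const int RIGHT_REV = 10;

    void setup() {
      pinMode(SIG_A, INPUT);
      pinMode(SIG_B, INPUT);
      pinMode(SIG_C, INPUT);
      pinMode(LEFT_FWD, OUTPUT);  pinMode(LEFT_REV, OUTPUT);
      pinMode(RIGHT_FWD, OUTPUT); pinMode(RIGHT_REV, OUTPUT);
    }

    void drive(int lf, int lr, int rf, int rr) {
      digitalWrite(LEFT_FWD, lf);  digitalWrite(LEFT_REV, lr);
      digitalWrite(RIGHT_FWD, rf); digitalWrite(RIGHT_REV, rr);
    }

    void loop() {
      int a = digitalRead(SIG_A);
      int b = digitalRead(SIG_B);
      int c = digitalRead(SIG_C);

      // Decode the HIGH/LOW combination into a motor action (placeholder mapping).
      if      (a == HIGH && b == LOW && c == LOW) drive(HIGH, LOW, HIGH, LOW); // forward
      else if (a == LOW && b == HIGH && c == LOW) drive(LOW, LOW, HIGH, LOW);  // turn left
      else if (a == LOW && b == LOW && c == HIGH) drive(HIGH, LOW, LOW, LOW);  // turn right
      else                                        drive(LOW, LOW, LOW, LOW);  // stop / wait
    }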


For more information on what each device sends to one another, a table was provided in Week 8 to model all communication between hardware.

Since the cart does not yet have all of its parts in, however, the LEDs connected to the prototype still stand in for the motors, representing direction and distance status. The prototype for the control and mechanical module was expanded to include a third LED so that it better represents the signals that will be sent from one Arduino Uno to the other. The combination of three HIGHs and LOWs now represents what the Primary Arduino received from the user's app: whether the user pressed CALL, WAIT to stop motion, or one of the manual controls, which were added this week.

Writing the conditions for manual control also ended up being more complicated than originally thought. While developing the code for the command, the Arduino Uno again had difficulty producing the correct response to the values it was given. MIT App Inventor properly sends any text input and the Arduino receives it, but for a while the code would perform a command such as “forward” only once and then freeze. Manual operation was only achieved by writing the commands as an IF statement in the correct area of the code, as sketched below. Better app communication with the board, allowing more accessible features such as holding the buttons for manual control, would be greatly desired, but to save time and keep the focus on building the cart and the sensors, a simple manual control scheme is what was built. Full functionality of all buttons on the Red Rover was achieved, and they can be pressed in any order as needed by the user to switch between automatic and manual control.
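
A rough sketch of the shape that fix takes is shown below: the command is read and dispatched by an if/else chain inside loop(), so the board responds to every button press instead of handling only the first one. The command strings, the newline-terminated text protocol, the baud rate, and the stubbed direction helpers are assumptions for illustration, not the project's actual sketch.

    // Direction helpers as in the primary-controller sketch above (stubbed here).
    void forward()   { /* write the forward pin combination */ }
    void turnLeft()  { /* ... */ }
    void turnRight() { /* ... */ }
    void stopCart()  { /* ... */ }
    void callMode()  { /* hand off to the autonomous RSSI logic */ }

    void setup() {
      Serial.begin(9600);   // assumed baud rate for the Bluetooth module
    }

    void loop() {
      // Checking for a command on every pass of loop(), rather than inside a
      // one-shot or blocking section, keeps the board responsive to each press.
      if (Serial.available() > 0) {
        String cmd = Serial.readStringUntil('\n');
        cmd.trim();

        if      (cmd == "forward") forward();
        else if (cmd == "left")    turnLeft();
        else if (cmd == "right")   turnRight();
        else if (cmd == "wait")    stopCart();
        else if (cmd == "call")    callMode();
      }
    }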


The IR transceiver will be added to the cart and tested this weekend, along with initial designs for how the power module will be wired to the appropriate hardware.




 
 
 
