Tetrix

Overview: 
Explore adding logging of debugging information to a file on the controller phone and then pulling that file back to the PC for examination.
Objectives: 

Add the provided logging utility source code to your project and then understand how to use logging in your programs.

Content: 

We are now going to take a look at logging (also called tracing) as a tool to debug our robot programs. Logging is recording useful information from the robot program to a disk file on the controller device. You can then download that file to your PC and examine it. It can be very useful to record information while your robot is running during a match so you can look at it afterwards and see what took place. If you have not read the general lesson on Logging, you should do that now.

To get started, we need to get the provided logging code into your robot controller project. In the teamcode package, create a new class called Logging. Click here to open the logging class code. Copy the code and paste it into the new Logging class you just created. This adds the logging class to your project and makes it available for use in your OpModes.

Now we are going to copy the DriveCircleTouch example and add logging to it. Create a new class in the teamcode package called DriveCircleLogging. Copy the code below into that class.

Now let's discuss the changes made to implement logging. We added a constructor to this class, and in that constructor we call Logging.Setup(), which initializes the logging system. We then write a message to the log file with the Logging.log() method. We also added other messages recording the progress of the OpMode.
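To make the idea concrete, here is a minimal sketch of what a file-based logging utility like this might look like. This is not the provided Logging class from the link above (that one also records the calling method and line number, as seen in the sample output later in this lesson); the method names here simply mirror how the lesson uses them.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.text.SimpleDateFormat;
import java.util.Date;

// Minimal illustrative sketch of a file-based logging utility.
public class Logging
{
    private static PrintWriter writer;

    // Open (or append to) the log file. Call once, e.g. from the OpMode constructor.
    public static void Setup(String fileName)
    {
        try
        {
            // Second argument true = append; autoflush so messages survive a crash.
            writer = new PrintWriter(new FileWriter(fileName, true), true);
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
    }

    // Write a timestamped message; supports standard format specifiers.
    public static void log(String format, Object... args)
    {
        String time = new SimpleDateFormat("HH:mm:ss:SSS").format(new Date());
        writer.println(time + " " + String.format(format, args));
    }

    public static void close()
    {
        writer.close();
    }
}
```

A call like Logging.log("stopFlag=%b, i=%d", stopFlag, i) would then produce a timestamped line similar to those in the sample log shown later in this lesson.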

The logging class will write the messages to a file called Logging.txt in the top directory of the controller device. You can use ADB (Android Debug Bridge) to download that file from the device. Open the terminal window at the bottom of Android Studio. Copy and paste this command into the terminal window and press enter:

ZTE: adb pull //storage/sdcard0/Logging.txt c:\temp\robot_logging.txt
MOTO G: adb pull sdcard/Logging.txt c:\temp\robot_logging.txt
Control Hub: adb pull sdcard/Logging.txt c:\temp\robot_logging.txt

This will pull the file from the device into the PC directory specified. You can then view the file with Notepad.

Run the program 3 times: first, let it run the full 5 seconds and stop on timeout. Then run it again and press the touch button before the 5 seconds pass. Last, run it and click the Stop button on the Driver Station before time runs out. Then pull the Logging.txt file back to your PC and take a look. It should look like this:

<0>02:43:46:360 DriveCircleLogging.<init>(DriveCircleLogging.java:25): Starting Drive Circle Logging
<1>02:43:46:377 DriveCircleLogging.runOpMode(DriveCircleLogging.java:41): waiting for start
<1>02:43:47:286 DriveCircleLogging.runOpMode(DriveCircleLogging.java:49): running
<1>02:43:53:299 DriveCircleLogging.runOpMode(DriveCircleLogging.java:68): timeout
<1>02:43:53:301 DriveCircleLogging.runOpMode(DriveCircleLogging.java:81): out of while loop
<1>02:43:53:312 DriveCircleLogging.runOpMode(DriveCircleLogging.java:91): stopFlag=true, i=3, d=3.750000
<1>02:43:53:314 DriveCircleLogging.runOpMode(DriveCircleLogging.java:93): done
<0>02:43:54:647 ========================================================================
<0>02:43:54:650 DriveCircleLogging.<init>(DriveCircleLogging.java:25): Starting Drive Circle Logging
<2>02:43:54:662 DriveCircleLogging.runOpMode(DriveCircleLogging.java:41): waiting for start
<2>02:43:55:305 DriveCircleLogging.runOpMode(DriveCircleLogging.java:49): running
<2>02:43:57:456 DriveCircleLogging.runOpMode(DriveCircleLogging.java:74): button touched
<2>02:43:57:464 DriveCircleLogging.runOpMode(DriveCircleLogging.java:81): out of while loop
<2>02:43:57:480 DriveCircleLogging.runOpMode(DriveCircleLogging.java:91): stopFlag=true, i=3, d=3.750000
<2>02:43:57:483 DriveCircleLogging.runOpMode(DriveCircleLogging.java:93): done

Note the log message that uses format specifiers to merge variable data into the log message. You can read more about formatting here.
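The stopFlag=true, i=3, d=3.750000 text in the log above comes from Java's standard format specifiers, the same ones used by String.format(). A small standalone demonstration (the class name is ours):

```java
public class FormatDemo
{
    public static void main(String[] args)
    {
        boolean stopFlag = true;
        int i = 3;
        double d = 3.75;

        // %b = boolean, %d = integer, %f = floating point (6 decimal places by default)
        String msg = String.format("stopFlag=%b, i=%d, d=%f", stopFlag, i, d);
        System.out.println(msg);
    }
}
```

Note how %f pads the value out to six decimal places, which is why 3.75 appears as 3.750000 in the log.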

 

Navigation:

Overview: 
Learn about the Vuforia vision API and how to locate and identify VuMarks.
Objectives: 

Understand how to use Vuforia to locate VuMarks.

Content: 

Vuforia is a library of classes included in the FTC SDK that supports using the RC phone camera to locate objects in the camera field of view and return actionable information to your program. That information may allow your robot to locate and navigate to a visual target, locate an object or recognize images. In this lesson we are focusing on a feature of Vuforia called VuMarks. A VuMark is an image that is like a bar code: encoded information is contained in the image. For instance, in the Relic Recovery game, the temple images might look identical, but each has identifying information (left, right and center) encoded in hexagonal dots. Vuforia can detect VuMark images when they are in the camera field of view, read the encoded information and return it to your program. In the Relic Recovery game, your program can use the returned information to determine the relic stacking scheme that gains the most points.

VuMarks are defined by data files created with the Vuforia Target Manager. The data files are then embedded in your program by putting them in the assets directory of the FtcRobotController section of the SDK project. FIRST includes VuMark files in the SDK when VuMarks are used in a game. Because they are embedded in the finished robot controller app, your code can read these files to get the VuMark identification information needed to identify images.

There are several examples in the FTC SDK examples section, but here is a simplified example. This example makes the VuMark finding code generic, meaning it can be used for any VuMark, and puts that code in its own class. This makes the OpMode itself simple: it just looks for a VuMark and, when found, converts the VuMark's id information to the form (enum) used by the Relic Recovery game. The example also shows the X (left/right), Y (up/down) and Z (distance) offsets of the center of the VuMark image in relation to the center of the camera field of view.

Note: To use Vuforia you will need an API Key from the Vuforia Developer web site. You can register and get a free developer key to use in your program. If you plan to use a  webcam, be sure to get the API Key for external cameras.

The code supports both phone camera and USB webcam (Control Hub) by changing which constructor you use to create an instance of the VuMarkFinder class. Here is more about using web cams including the Driver Station camera preview feature.

 

Navigation:

Overview: 
Explore the use of PID controllers to manage motors.
Objectives: 

Understand what a PID controller is and how to use one to control robot motors.

Content: 

With the test robot used to develop this course, there is a problem with the previous examples of turning under gyro or IMU control. When turning at a constant power setting and setting the power to zero when the target angle is reached, depending on motor configuration, gear ratio, robot weight and the turn power, the robot will most likely not stop quickly enough to end on the desired angle. This is called overshoot. On our test robot, a 90 degree turn would end up being 110-120 degrees. Fixing this manually can be tricky, but there is an automated way to better control the turn.

To do this we will use a software routine called a PID controller. PID stands for Proportional, Integral, Derivative. The idea behind a PID controller is to take the desired state value (90 degrees in our turn example), compare it to the actual (feedback) state value (the current gyro or IMU angle) and apply factors to the difference (called the error) to produce a proportional output value. So in our example we start out turning away from zero towards 90 degrees at the full power we specify. As the turn progresses, the angle of turn is fed to the PID controller routine, which measures the error and produces a value (the turn power) at or near the starting (full) power. As the turn gets closer to 90 degrees, the PID routine starts to return smaller and smaller values, thus reducing the power being applied and slowing the rate of turn. In theory this reduction in power and slowing rate of turn will eliminate the overshoot. The PID controller can also apply a tolerance margin that indicates when the actual value is within some percentage of the target to further control robot motors.
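To make the turn example concrete, here is a small standalone sketch of proportional-only control. The class name is ours and the gyro readings are simulated values, not SDK calls; the P factor follows the .30 / 90 calculation derived later in this lesson.

```java
public class PTurnDemo
{
    public static void main(String[] args)
    {
        // Illustrative proportional control for a 90 degree turn.
        // P chosen so that maximum error (90 degrees) yields roughly 30% power.
        double target = 90.0;
        double p = 0.003;

        // Simulated gyro readings as the turn progresses.
        double[] angles = { 0, 30, 60, 80, 88, 90 };

        for (double angle : angles)
        {
            double error = target - angle;      // shrinks as we approach the target
            double power = p * error;           // so the applied power shrinks too
            System.out.printf("angle=%5.1f  error=%5.1f  power=%.3f%n", angle, error, power);
        }
    }
}
```

Running this shows the power tapering toward zero as the measured angle approaches 90 degrees, which is exactly the behavior that reduces overshoot.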

A similar example is to use a PID controller to compute the power correction needed to make the robot drive in a straight line. Here the target is zero (gyro not deviating from direction we are traveling) and any change in the measured angle will result in a correction value calculated by the PID controller.

A PID controller can take a lot of tuning to get the desired result, but once you get a feel for how PID controllers work, you can tune them fairly quickly. By using all three of the PID factors you can get quite fine control over motors.

There are many resources and discussions of PID online. Here, here and here are some resources to start with to investigate PID further. The FIRST forums on programming have extensive discussions of PID in robot applications.

Here is the source for a class called PIDController that will perform the PID calculations. Create a new class called PIDController and paste that code into it. Because PIDController is in the same package as the examples below, Java will be able to locate it when you reference it in the examples. PIDController is a library or utility class, a class that does nothing by itself but is used by other classes.

The example below takes the previous DriveAvoidImu example and uses two PIDController instances to manage straight driving and the 90 degree turn on obstacle contact. 

An obvious question is how we arrived at .003 for the P value on the turn PID controller. In the case of moving from a non-zero error to zero error, we took the maximum error value, 90, and the power level we want applied at max error, .30 (30%). We divided the power by the max error ( .30 / 90 = .003 ) to determine P.

Often, with just a P value, the robot may stall out before completing the turn because the PID controller reduces the power below the level that will move the robot. To fix this, we add some I (integral) value. The I value compensates for P not reaching the setpoint by gradually adding power until the robot completes the turn. A good starting I value is P / 100. You can adjust I to get the turn completed accurately in a timely fashion. Note that these values are optimal for a 90 degree turn with 30% power. They will not work as well for other angles or power levels. You would have to compute new values for other angle/power combinations. Here is the above example modified to compute the P and I values for any angle/power combination input to the rotate() function.
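The rule of thumb described above (P = power at max error divided by max error, I = P / 100) is plain arithmetic and can be sketched as follows. The class and method names here are hypothetical helpers of ours, not part of the FTC SDK or the PIDController class; note that .30 / 90 is .00333..., which the lesson rounds to .003.

```java
public class PidGains
{
    // Starting P: desired power at maximum error, divided by that maximum
    // error (the size of the turn in degrees), per this lesson's rule.
    public static double computeP(double degrees, double power)
    {
        return power / Math.abs(degrees);
    }

    // Starting I: P / 100, per this lesson's rule of thumb.
    public static double computeI(double p)
    {
        return p / 100.0;
    }

    public static void main(String[] args)
    {
        double p = computeP(90, 0.30);   // 0.30 / 90
        double i = computeI(p);
        System.out.println("P = " + p + ", I = " + i);
    }
}
```

Computing the gains from the requested angle and power, as the modified rotate() function does, means a 45 degree turn at 20% power gets its own appropriate values rather than reusing the 90 degree ones.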

In the case of driving straight, the target and error are the same at the start so the error is zero. So we have to determine how much correction we want to apply for how much error. Experimentation showed that .05 (5%) correction power for 1 degree of error worked well, correcting the error without overshooting too much (wandering): .05  / 1 = .05 for P.
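The straight-driving case can be sketched standalone as well (the class name and the simulated headings are ours, not SDK calls). The setpoint is zero, so the error is simply the measured heading, and the sign of the correction flips with the direction of drift:

```java
public class StraightDriveDemo
{
    public static void main(String[] args)
    {
        // Setpoint is 0 degrees (no deviation); P = .05 / 1 as derived above.
        double p = 0.05;

        // Simulated headings: positive = drifted one way, negative = the other.
        double[] headings = { 0.0, 1.0, -2.0, 0.5 };

        for (double heading : headings)
        {
            double error = 0.0 - heading;
            // Typically added to one side's motor power and subtracted
            // from the other, steering the robot back toward 0 degrees.
            double correction = p * error;
            System.out.printf("heading=%5.1f  correction=%+.3f%n", heading, correction);
        }
    }
}
```

A drift of 1 degree produces a 5% power correction in the opposing direction, matching the experimentally determined value above.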

As always, you can do these calculations to determine starting P and I values and then adjust them to tune actual robot behavior.

In most simple cases only a P value is needed. I may be needed if you can't find a P value that reaches the setpoint without overshooting. A discussion of using the D value is beyond the scope of this lesson.

 

Navigation:

Overview: 
Learn about the REV Hub's built-in IMU, or Inertial Measurement Unit, and how to use it.
Objectives: 

Understand the REV Hub's built-in IMU and how to use it.

Content: 

The REV Expansion Hub has a built-in IMU, or Inertial Measurement Unit. This is a sensor that can measure acceleration (movement) in several axes. It can be used in place of an external gyro. The IMU is not used in quite the same way as the gyro but is similar. Note: you must configure the IMU on I2C channel 0, port 0. Here is the DriveAvoid example converted to use the IMU in place of the MR gyro.

 

Navigation:

Overview: 
Explore using the Gyroscope Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Gyroscope Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a gyroscopic sensor designed for use with the Tetrix control system. This sensor can return heading and rate of rotation information. Here is a detailed discussion of the Gyro sensor on the Modern Robotics website. There are links on the page to programming information. This is recommended viewing. Note that the REV Hub has a built-in gyro as part of its IMU, discussed in the next lesson.

If you use the MR Gyro with the REV Hub the gyro is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

Below is a simplified sample program that uses a gyro to drive in a straight line and avoid obstacles by backing up from contact with an obstacle, turning 90 degrees and resuming driving in the new direction. You can paste it into AS and experiment with it.

 

Navigation:

Overview: 
Explore using the Compass Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Compass Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a Compass sensor designed for use with the Tetrix control system. This sensor can read the magnetic heading of the sensor, acceleration and tilt. The FTC SDK has a sample program you can use to experiment with the Compass sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRCompass. You can enable this program and work with it, but any changes you make will be overwritten at the next update of the SDK. You can copy the class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the Compass sensor on the Modern Robotics website. There is a link on the page to programming information. This is recommended viewing.

Note that with the REV Hub the Compass sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

 

Navigation:

Overview: 
Explore using the Optical Distance Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Optical Distance Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has an Optical Distance sensor designed for use with the Tetrix control system. This sensor can read the distance to a surface when the surface is within 15 centimeters of the sensor. The FTC SDK has a sample program you can use to experiment with the Optical Distance sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMROpticalDistance. You can enable this program and work with it but any changes you make will be overwritten at the next update of the SDK. You can copy the class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the Optical Distance sensor on the Modern Robotics website. There is a link on the page to programming information. This is recommended viewing.

Note that with the REV Hub the MR Optical Distance sensor is plugged into an Analog port.

REV also has a distance sensor (combined with a color sensor) and you can find sample code at FtcRobotController/java/[first package]/external.samples/SensorREVColorDistance.

Note that with the REV Hub the color sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

Navigation:

Overview: 
Explore using the Color Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Color Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a color sensor designed for use with the Tetrix control system. This sensor can read the color of a surface when the surface is within a few centimeters of the sensor. The FTC SDK has a sample program you can use to experiment with the color sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRColor. You can enable this program and work with it but any changes you make will be overwritten at the next update of the SDK. You can copy the class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the color sensor on the Modern Robotics website. There is a link on the page to programming information. This is recommended viewing.

REV also has color and color-distance sensors. The sample SensorColor works with the color only sensor and SensorREVColorDistance works with the combined sensor.

Note that with the REV Hub the color sensors are plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

This example shows an interesting technique. It gains access to the user interface elements of the FtcRobotController app and uses that access to change the background color of the controller app to match the color detected by the sensor. How this is done and the many other possibilities this opens are beyond the scope of this lesson, but the ability to access and use features of the controller phone is something to be aware of.

 

Navigation:

Overview: 
Explore using the Range Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Range Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a range sensor designed for use with the Tetrix control system. This sensor can read the distance to a surface when the surface is between 5 and 255 centimeters from the sensor. The FTC SDK has a sample program you can use to experiment with the range sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRRangeSensor. You can enable this program and work with it, but any changes you make will be overwritten at the next update of the SDK. You can copy this class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the range sensor on the Modern Robotics website. There is a link on that page to programming information. This is recommended viewing.

Note that with the REV Hub the MR Range sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

REV has a distance sensor as well and sample code for it can be found at FtcRobotController/java/[first package]/external.samples/SensorREV2mDistance.

Note that with the REV Hub the distance sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

Navigation:

Overview: 
Explore using the IR Seeker device to gather information about the robot's environment.
Objectives: 

Understand how to use the IR Seeker to gather information about the robot's environment.

Content: 

Modern Robotics has an IR beacon sensor (called the IR Seeker V3) designed for use with the Tetrix control system. This sensor can detect the IR beacons used for some FTC games and provide information about the location of the beacon relative to the robot. The FTC SDK has a sample program you can use to experiment with the IR Seeker. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRIrSeeker. You can enable this program and work with it, but any changes you make will be overwritten at the next update of the SDK. You can copy that class to the TeamCode area so any changes you make will be retained. Right click the SensorMRIrSeeker class and click copy. Then right click on the teamcode package in the TeamCode area and click paste. AS will copy the example class into the TeamCode area and adjust the package name.

Here is a discussion of the IR Seeker on the Modern Robotics website.

Note: Modern Robotics sensors and integrated motor encoders cannot be directly connected to the REV Expansion Hub. This is due to the different voltages used by the two systems. You must use a level shifter board and, for sensors, a crossover cable to connect MR devices to the REV Hub. Here is a video describing this issue.

Further note that with the REV Hub the IR Seeker sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

 

Navigation:
