Java

Overview: 
Learn about the Vuforia vision API and how to locate and identify VuMarks.
Objectives: 

Understand how to use Vuforia to locate VuMarks.

Content: 

Vuforia is a library of classes included in the FTC SDK that supports using the RC phone camera to locate objects in the camera field of view and return actionable information to your program. That information may allow your robot to locate and navigate to a visual target, locate an object or recognize images. In this lesson we are focusing on a feature of Vuforia called VuMarks. A VuMark is an image, similar to a bar code, that contains encoded information. For instance, in the Relic Recovery game, the temple images look identical to the eye, but each has identifying information (left, right and center) encoded in hexagonal dots. Vuforia can detect VuMark images in the camera field of view, read the encoded information and return it to your program. In the Relic Recovery game, your program can use the returned information to determine the relic stacking scheme that gains the most points.

VuMarks are defined by data files created with the Vuforia Target Manager. The data files are then embedded in your program by putting them in the assets directory of the FtcRobotController section of the SDK project. FIRST includes VuMark files in the SDK when VuMarks are used in a game. Because they are embedded in the finished robot controller app, your code can read these files to get the VuMark identification information needed to identify images.

There are several examples in the FTC SDK examples section, but here is a simplified example. This example makes the VuMark finding code generic, meaning it can be used for any VuMark, and puts that code in its own class. This makes the opMode itself simple: it just looks for a VuMark and, when found, converts the VuMark's id information to the form (enum) used by the Relic Recovery game. The example also shows the X (left/right), Y (up/down) and Z (distance) offsets of the center of the VuMark image in relation to the center of the camera field of view.

Note: To use Vuforia you will need an API Key from the Vuforia Developer web site. You can register and get a free developer key to use in your program. If you plan to use a webcam, be sure to get the API Key for external cameras.

The code supports both the phone camera and a USB webcam (Control Hub) by changing which constructor you use to create an instance of the VuMarkFinder class. Here is more about using webcams, including the Driver Station camera preview feature.

 

Navigation:

Overview: 
Explore the use of PID controllers to manage motors.
Objectives: 

Understand what a PID controller is and how to use one to control robot motors.

Content: 

With the test robot used to develop this course, there is a problem with the previous examples of turning under gyro or IMU control. When turning at a constant power setting and setting the power to zero when the target angle is reached, depending on motor configuration, gear ratio, robot weight and the turn power, the robot will most likely not stop quickly enough to end up on the desired angle. This is called overshoot. On our test robot, a 90 degree turn would end up being 110-120 degrees. Fixing this manually can be tricky, but there is an automated way to better control the turn.

To do this we will use a software routine called a PID controller. PID stands for Proportional, Integral, Derivative. The idea behind a PID controller is to take the desired state value (90 degrees in our turn example), compare that to the actual (feedback) state value (the current gyro or IMU angle) and apply factors to the difference (called the error) that produce a proportional output value. So in our example we start out turning away from zero towards 90 degrees at the full power we specify. As the turn progresses, the angle of turn is fed to the PID controller routine which measures error and produces a value (the turn power) at or near the starting (full) power. As the turn gets closer to 90 degrees, the PID routine starts to return smaller and smaller values thus reducing the power being applied and slowing the rate of turn. In theory this reduction in power and slowing rate of turn will eliminate the overshoot. The PID controller can also apply a tolerance margin that indicates when the actual value is within some percentage of the target to further control robot motors.
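The calculation described above can be sketched in plain Java. The class and method names here (PIDController, performPID) mirror the ones used in this lesson, but this is a simplified illustration under stated assumptions, not the full class whose source is linked later:

```java
// A minimal PID controller sketch. Simplified for illustration; the full
// PIDController class used by this lesson's examples has more features.
public class PIDController {
    private final double kP, kI, kD;   // the three tuning factors
    private double setpoint;           // the desired state value (e.g. 90 degrees)
    private double totalError;         // accumulated error, drives the I term
    private double prevError;          // previous error, drives the D term
    private double minOutput = -1.0, maxOutput = 1.0;

    public PIDController(double kP, double kI, double kD) {
        this.kP = kP; this.kI = kI; this.kD = kD;
    }

    public void setSetpoint(double setpoint) { this.setpoint = setpoint; }

    public void setOutputRange(double min, double max) {
        this.minOutput = min; this.maxOutput = max;
    }

    // Call once per pass of the opMode loop with the current sensor reading
    // (for a turn, the current gyro or IMU angle). Returns the motor power.
    public double performPID(double input) {
        double error = setpoint - input;
        totalError += error;
        double output = kP * error + kI * totalError + kD * (error - prevError);
        prevError = error;
        // Clamp to the allowed power range so large errors don't exceed it.
        return Math.max(minOutput, Math.min(maxOutput, output));
    }

    public static void main(String[] args) {
        // 90 degree turn: a large error returns roughly the starting power,
        // and the output shrinks as the measured angle nears the setpoint.
        PIDController pid = new PIDController(0.003, 0.0, 0.0);
        pid.setSetpoint(90);
        pid.setOutputRange(-0.30, 0.30);
        System.out.println(pid.performPID(0));    // near full power at the start
        System.out.println(pid.performPID(45));   // about half power mid-turn
        System.out.println(pid.performPID(85));   // small power near the target
    }
}
```

Note how the output is nothing more than the error scaled by the factors, clamped to the power range the motors can accept.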

A similar example is to use a PID controller to compute the power correction needed to make the robot drive in a straight line. Here the target is zero (gyro not deviating from direction we are traveling) and any change in the measured angle will result in a correction value calculated by the PID controller.

A PID controller can take a lot of tuning to get the desired result but once you get a feel for how they work you can tune them fairly quickly and by using all three of the PID factors you can get quite fine control over motors. 

There are many resources and discussions of PID online. Here, here and here are some resources to start with to investigate PID further. The FIRST forums on programming have extensive discussions of PID in robot applications.

Here is the source for a class called PIDController that will perform the PID calculations. Create a new class called PIDController and paste that code into it. Because PIDController is in the same package as the examples below, Java will be able to locate it when you reference it in the examples. PIDController is a library or utility class, a class that does nothing by itself but is used by other classes.

The example below takes the previous DriveAvoidImu example and uses two PIDController instances to manage straight driving and the 90 degree turn on obstacle contact. 

An obvious question is how we arrived at .003 for the P value on the turn PID controller. In the case of moving from a non-zero error to zero error, we took the maximum error value, 90, and the power level we want applied at max error, .30 (30%). We divided the power by the max error ( .30 / 90 ≈ .003 ) to determine P.

Often with just a P value alone, the robot may stall out before completing the turn because the PID controller reduces the power below the level which will move the robot. To fix this, we add some I (integral) value. The I value will compensate for P not reaching the setpoint and start adding power until the robot completes the turn. A good starting I value is P / 100. You can adjust I to get the turn completed accurately in a timely fashion. Note that these values are optimal for a 90 degree turn with 30% power. They will not work as well for other angles or power levels. You would have to compute new values for other angle/power combinations. Here is the above example modified to compute the P and I values for any angle/power combination input to the rotate() function.
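The rules of thumb above (P is the desired power divided by the maximum error, I is P / 100) amount to two lines of arithmetic. A small helper along these lines could compute them for any angle/power combination; the class and method names here are hypothetical, not part of the FTC SDK:

```java
// Compute starting P and I values for any angle/power combination, using the
// rules of thumb from this lesson: P = power / maxError and I = P / 100.
// Class and method names are illustrative only.
public class TurnGains {
    public static double computeP(double targetAngle, double power) {
        return power / Math.abs(targetAngle);   // power applied at maximum error
    }

    public static double computeI(double p) {
        return p / 100.0;                       // small nudge to finish the turn
    }

    public static void main(String[] args) {
        // The 90 degree turn at 30% power from the example above.
        double p = computeP(90, 0.30);          // 0.30 / 90, roughly .0033
        double i = computeI(p);
        System.out.println("P = " + p + ", I = " + i);
    }
}
```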

In the case of driving straight, the target and the actual value are the same at the start, so the error is zero. We have to determine how much correction we want to apply for a given amount of error. Experimentation showed that .05 (5%) correction power for 1 degree of error worked well, correcting the error without overshooting too much (wandering): .05 / 1 = .05 for P.
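Applied to the drivetrain, that proportional correction might look like the sketch below. The method name and the sign convention (which side gets the added power) are assumptions for illustration and depend on your robot:

```java
// Proportional straight-line correction: the target heading is 0 and any
// deviation produces .05 of correction power per degree of error. Which side
// gets the + and which the - depends on your drivetrain; this is one choice.
public class StraightDrive {
    static final double P = 0.05;   // .05 / 1 degree, found by experimentation

    // Returns {leftPower, rightPower} for the given base power and the
    // current heading reported by the gyro or IMU (0 means on course).
    public static double[] motorPowers(double basePower, double heading) {
        double correction = P * heading;
        return new double[] { basePower - correction, basePower + correction };
    }

    public static void main(String[] args) {
        double[] powers = motorPowers(0.30, 2.0);  // drifted 2 degrees off course
        System.out.println("left=" + powers[0] + " right=" + powers[1]);
    }
}
```

When the heading is exactly zero the correction is zero and both sides get the base power; any drift steers the robot back toward the target heading.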

As always, you can do these calculations to determine starting P and I values and then adjust them to tune actual robot behavior.

In most simple cases only a P value is needed. I may be needed if you can't find a P value that reaches the setpoint without overshooting. A discussion of using the D value is beyond the scope of this lesson.

 

Navigation:

Overview: 
Learn about the REV Hub's built-in IMU, or Inertial Measurement Unit, and how to use it.
Objectives: 

Understand the REV Hub's built-in IMU and how to use it.

Content: 

The REV Expansion Hub has a built-in IMU, or Inertial Measurement Unit. This is a sensor that can measure acceleration (movement) in several axes. It can be used in place of an external gyro. The IMU is not used in quite the same way as the gyro but is similar. Note: you must configure the IMU on I2C channel 0, port 0. Here is the DriveAvoid example converted to use the IMU in place of the MR gyro.

 

Navigation:

Overview: 
Explore using the Gyroscope Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Gyroscope Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a gyroscopic sensor designed for use with the Tetrix control system. This sensor can return heading and rate of rotation information. Here is a detailed discussion of the Gyro sensor on the Modern Robotics website. There are links on the page to programming information. This is recommended viewing. Note that the REV Hub has a built-in gyro as part of its IMU, discussed in the next lesson.

If you use the MR Gyro with the REV Hub the gyro is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

Below is a simplified sample program that uses a gyro to drive in a straight line and avoid obstacles by backing up from contact with an obstacle, turning 90 degrees and resuming driving in the new direction. You can paste it into AS and experiment with it.

 

Navigation:

Overview: 
Explore using the Compass Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Compass Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a Compass sensor designed for use with the Tetrix control system. This sensor can read the magnetic heading of the sensor, acceleration and tilt. The FTC SDK has a sample program you can use to experiment with the Compass sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRCompass. You can enable this program and work with it, but any changes you make will be overwritten at the next update of the SDK. You can copy the class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the Compass sensor on the Modern Robotics website. There is a link on the page to programming information. This is recommended viewing.

Note that with the REV Hub the Compass sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

 

Navigation:

Overview: 
Explore using the Optical Distance Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Optical Distance Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has an Optical Distance sensor designed for use with the Tetrix control system. This sensor can read the distance to a surface when the surface is within 15 centimeters of the sensor. The FTC SDK has a sample program you can use to experiment with the Optical Distance sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMROpticalDistance. You can enable this program and work with it but any changes you make will be overwritten at the next update of the SDK. You can copy the class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the Optical Distance sensor on the Modern Robotics website. There is a link on the page to programming information. This is recommended viewing.

Note that with the REV Hub the MR Optical Distance sensor is plugged into an Analog port.

REV also has a distance sensor (combined with a color sensor) and you can find sample code at FtcRobotController/java/[first package]/external.samples/SensorREVColorDistance.

Note that with the REV Hub the color sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

Navigation:

Overview: 
Explore using the Color Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Color Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a color sensor designed for use with the Tetrix control system. This sensor can read the color of a surface when the surface is within a few centimeters of the sensor. The FTC SDK has a sample program you can use to experiment with the color sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRColor. You can enable this program and work with it but any changes you make will be overwritten at the next update of the SDK. You can copy the class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the color sensor on the Modern Robotics website. There is a link on the page to programming information. This is recommended viewing.

REV also has color and color-distance sensors. The sample SensorColor works with the color-only sensor and SensorREVColorDistance works with the combined sensor.

Note that with the REV Hub the color sensors are plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

This example shows an interesting technique. It gains access to the user interface elements of the FtcRobotController app and uses that access to change the background color of the controller app to match the color detected by the sensor. How this is done and the many other possibilities this opens are beyond the scope of this lesson, but the ability to access and use features of the controller phone is something to be aware of.

 

Navigation:

Overview: 
Explore using the Range Sensor device to gather information about the robot's environment.
Objectives: 

Understand how to use the Range Sensor to gather information about the robot's environment.

Content: 

Modern Robotics has a range sensor designed for use with the Tetrix control system. This sensor can read the distance to a surface when the surface is between 5 and 255 centimeters from the sensor. The FTC SDK has a sample program you can use to experiment with the range sensor. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRRangeSensor. You can enable this program and work with it, but any changes you make will be overwritten at the next update of the SDK. You can copy this class to the teamcode area so any changes you make will be retained.

Here is a detailed discussion of the range sensor on the Modern Robotics website. There is a link on that page to programming information. This is recommended viewing.

Note that with the REV Hub the MR Range sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

REV has a distance sensor as well and sample code for it can be found at FtcRobotController/java/[first package]/external.samples/SensorREV2mDistance.

Note that with the REV Hub the distance sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

Navigation:

Overview: 
Explore using the IR Seeker device to gather information about the robot's environment.
Objectives: 

Understand how to use the IR Seeker to gather information about the robot's environment.

Content: 

Modern Robotics has an IR beacon sensor (called IR Seeker V3) designed for use with the Tetrix control system. This sensor can detect the IR beacons used for some FTC games and provide information about the location of the beacon relative to the robot. The FTC SDK has a sample program you can use to experiment with the IR Seeker. In AS, open the path FtcRobotController/java/[first package]/external.samples/SensorMRIrSeeker. You can enable this program and work with it, but any changes you make will be overwritten at the next update of the SDK. You can copy that class to the TeamCode area so any changes you make will be retained. Right click the SensorMRIrSeeker class and click Copy. Then right click the teamcode package in the TeamCode area and click Paste. AS will copy the example class into the TeamCode area and adjust the package name.

Here is a discussion of the IR Seeker on the Modern Robotics website.

Note: Modern Robotics sensors and integrated motor encoders cannot be directly connected to the REV Expansion Hub. This is due to the different voltages used by the two systems. You must use a level shifter board and, for sensors, a crossover cable to connect MR devices to the REV Hub. Here is a video describing this issue.

Further note that with the REV Hub the IR Seeker sensor is plugged into an I2C port and configured by adding the device to the I2C Bus number matching the physical I2C port you plugged the sensor into.

 

Navigation:

Overview: 
Learn about timing issues in robotics programming.
Objectives: 

Understand the speed of program execution on a robot and what that means for your program. Understand how to write code that does not fail due to timing issues.

Content: 

When programming robots (or any other real-time system), it is very important to understand how the speed realm of the robot differs from your own. When we look at our robot program, we see the while loop we use to perform the repeating read-environment/make-decision/act pattern. We tend to think of this loop as happening slowly, or at least at the pace we imagine when we mentally step through these operations.

The reality is that the robot will execute the while loop in our program thousands of times per second. The robot operates much faster than you think. This can lead to bugs in your programs in some situations.

Consider the example code below. The purpose of the program is to perform two actions controlled by the gamepad buttons. The program will count each time you press button A and, on each press of button B, toggle a boolean variable from true to false. Download the program in Android Studio and test it on your robot.

So what happened? When you pressed the A button the immediate result is that the program counted hundreds or thousands of button presses. Why? Because while the A button press seemed quick to you, on the program's time scale the button was down for many repeats of the while loop, and it counted each loop. The same is true for the B button. The setting of the boolean value bButton seems erratic and definitely not a toggle on each press of the button. Again, this is because your idea of a button press and the program's idea of a button press are quite different.
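The heart of the problem can be reproduced in plain Java without a gamepad. This sketch (hypothetical names) stands in for the while loop reading the A button:

```java
// Sketch of the timing bug: a button that stays down across many passes of
// the opMode while loop is counted once per pass, not once per press.
public class ButtonCountBug {
    // Simulate 'loops' passes of the while loop with the button held down.
    public static int countPresses(int loops, boolean aButton) {
        int count = 0;
        for (int i = 0; i < loops; i++) {
            if (aButton) count++;   // counts every loop pass the button is down
        }
        return count;
    }

    public static void main(String[] args) {
        // One human "press" lasting 5000 loop passes is counted 5000 times.
        System.out.println(countPresses(5000, true));   // prints 5000
    }
}
```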

What do we do about this? We have to write code that converts the many button presses seen by the while loop into one logical press and acts only when the one logical press happens. Here is the example program modified to handle the timing problems:

Download this program into Android Studio and try it out. You should see the button presses now handled correctly.

The use of the variables aButtonPressed and bButtonPressed to track the state of the button is called latching.
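The latching pattern can be reduced to plain Java. The class below (illustrative names, no gamepad required) counts A presses and toggles a boolean on B presses, acting once per logical press:

```java
// Latching: convert many loop-level "button down" readings into one logical
// press by remembering the button state from the previous loop pass.
public class ButtonLatch {
    private boolean aButtonPressed = false;  // latch for button A
    private boolean bButtonPressed = false;  // latch for button B
    private int count = 0;                   // A press counter
    private boolean bState = false;          // toggled by B presses

    // Call once per pass of the while loop with the current button states.
    public void update(boolean aButton, boolean bButton) {
        if (aButton && !aButtonPressed) {
            aButtonPressed = true;           // latch on the down transition...
            count++;                         // ...and count the press once
        } else if (!aButton) {
            aButtonPressed = false;          // release the latch on button up
        }

        if (bButton && !bButtonPressed) {
            bButtonPressed = true;
            bState = !bState;                // toggle once per logical press
        } else if (!bButton) {
            bButtonPressed = false;
        }
    }

    public int getCount() { return count; }
    public boolean getBState() { return bState; }

    public static void main(String[] args) {
        ButtonLatch latch = new ButtonLatch();
        // A press held for 5000 loop passes, released, then pressed again.
        for (int i = 0; i < 5000; i++) latch.update(true, false);
        latch.update(false, false);
        for (int i = 0; i < 5000; i++) latch.update(true, false);
        System.out.println(latch.getCount());   // prints 2
    }
}
```

No matter how many loop passes the button is held down, the action fires only on the transition from up to down.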

In the earlier lesson Using Servos, we observed that the code that moved the arm up and down was susceptible to the timing issue discussed in this lesson. Even though you momentarily pushed the button to make the robot arm move up or down, the arm would keep moving after button release and it was hard to move the arm with any precision. Here is an example of how the code that handles moving the arm in response to button presses could be rewritten to move the arm a constant fixed amount for each quick button press and move the arm a longer distance for a long button press. Try it out:

 

Navigation:
