Robotics

Overview: 
Explore how physical robot hardware devices are mapped to names that can be used in programs to access those devices.
Objectives: 

Understand robot hardware mapping including controller phone configuration files and how to access hardware devices in software via the mapping scheme.

Content: 

A key function of the FTC SDK, and of any robotics API, is to provide access to the physical hardware of a robot to the software. A way must be provided to allow programmers to identify, in software, hardware devices on the robot so that they can write programs that interact with that hardware. On the Tetrix/FTC platform, this is called hardware mapping.

Hardware mapping consists of two parts, hardware configuration on the controller phone and hardware mapping in your OpMode class.

When the controller phone is attached to the robot's Core Power Distribution Module, the various control modules and devices plugged into the modules should be recognized by the phone. This is called scanning, and it is performed each time the phone is connected (or the robot is powered on). This set of hardware information is called a configuration. You may have more than one configuration stored on the phone. You access the configuration by clicking the three vertical dots in the upper right corner of the controller app main screen. On the menu, select Settings and then Configure Robot. If you already have one or more configuration files, they will be listed. If you have no configuration files, you will be shown a list of the hardware controllers recognized by the controller app.

We will get into the details of hardware configuration in a moment. Once a configuration has been created, click Save Configuration at the bottom of the controller hardware list. This saves the configuration into a file, which you will be prompted to name. After that you will see the list of available configuration files. For each file, you can Edit (change), Activate or Delete. Click Activate to make the selected configuration the current active configuration, which will be displayed in the title bar of each screen. Then use the back button to return to the main screen. The controller will make sure it can reach each controller module in the configuration and, if there are no problems, the main screen should display Robot Status as running and Op Mode as Stop Robot, with no error messages below that. The controller is now ready to run the robot.

When editing a hardware configuration you access each controller module and for each hardware device recognized by that module, assign a unique name by which you will access that device in your programs. You can also assign more meaningful names to the controller modules themselves though this is generally not needed. 

When you click on a motor or servo controller, you will see a list of the ports the controller has. You will have plugged motors or servos into these ports when constructing your robot. On the list, check the attached box next to a port if you have attached a device to that port. Then assign a meaningful name to that device. For instance, if you have the motor on the left side of your robot plugged into port 1, you could assign the name left_motor to port 1. This name is what you will use in your OpMode to control that motor. So configuration is all about telling your software which hardware devices (motors, servos, sensors) are on your robot, which port they are plugged into and assigning the device a name by which it will be known in your program.
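Conceptually, the configuration file is just a table that maps each device name to a controller port. As a plain-Java illustration of that idea (this is not FTC SDK code; the names and ports are the examples from above):

```java
import java.util.HashMap;
import java.util.Map;

public class ConfigDemo
{
    // The configuration file, conceptually: device name -> port number.
    static Map<String, Integer> configuration()
    {
        Map<String, Integer> motorPorts = new HashMap<>();
        motorPorts.put("left_motor", 1);    // left drive motor on port 1
        motorPorts.put("right_motor", 2);   // right drive motor on port 2
        return motorPorts;
    }

    public static void main(String[] args)
    {
        // Software looks devices up by the configured name, not the port.
        System.out.println("left_motor is on port " + configuration().get("left_motor"));
    }
}
```

Your OpMode code never needs to know which port a device is on; it only uses the name.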

Here is a lesson that describes the configuration process in more detail.

New for the 2017-18 season is the REV Robotics Expansion Hub. This device replaces the three controller modules of the Modern Robotics control scheme. You plug all your motors and sensors into the Expansion Hub(s) (Hubs can be daisy-chained). The process of creating the hardware configuration file on the controller phone is very much the same as with the Modern Robotics modules, but there is just one control module, the Hub. Here is a detailed discussion of creating the hardware configuration file for the Expansion Hub.

Once you have a hardware configuration file defined and active on your controller phone, you can proceed to the software side of hardware mapping.

In order to control your robot, you will need to create objects that control each of your hardware devices and connect those objects to the actual hardware devices. This is done through the hardwareMap object provided by the base OpMode class. For example, let's say our robot has two DC motors, named left_motor and right_motor in the phone configuration, and we want to control them in code:
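A sketch of what that code might look like (hedged: in the FTC SDK the motor class is spelled DcMotor, and hardwareMap.dcMotor is its device mapping; details may vary by SDK version):

```java
// Declared at the top of your OpMode class:
DcMotor leftMotor;
DcMotor rightMotor;

// In your initialization code: look up each motor by the name
// assigned in the active configuration file.
leftMotor  = hardwareMap.dcMotor.get("left_motor");
rightMotor = hardwareMap.dcMotor.get("right_motor");

// Later, to drive both motors at half power:
leftMotor.setPower(0.5);
rightMotor.setPower(0.5);
```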

Here we create two DcMotor reference variables and then use the hardwareMap.dcMotor device mapping, calling its get() method with the names we assigned in our hardware configuration file. The get() method locates the appropriate motor controller port, creates a DcMotor object mapped to it and returns a reference to that object into the leftMotor or rightMotor variable. Now we can control those motors using the various methods available on the DcMotor class, like setPower(), which sets the power level of the motor.

This code appears in your OpMode class. The motor definitions typically are at the top of your class. The hardware mapping should occur in your initialization section, either in the init_loop() function of a regular OpMode or before the waitForStart() call in a linear OpMode. The setPower() calls would appear in your loop() method for a regular OpMode or after waitForStart(), to control actual robot movement.

There is a class for every hardware device and the hardwareMap package has subclasses to map every device. You will need to review the FTC API documentation to become familiar with the device classes available and the fields and methods each class has.

In this manner we map all of a robot's hardware devices to object instances of the appropriate class. We then use the object instances to interact with the devices.

Note that the OpMode classes (that you extend) provide built-in access to the Xbox controllers through the variables gamepad1 and gamepad2. This means you don't have to do the hardware mapping for the controllers.

 

Navigation:

Overview: 
Explore the purpose and content of the FTC SDK Library.
Objectives: 

Understand what the FTC SDK Library is and how to use it to interact with robot hardware.

Content: 

The FTC SDK Library is a library of classes that allow your programs to access and control all aspects of the Tetrix robot control system and the hardware devices attached to it. This library is the API for the control system and robot hardware. The library is included in the FTC SDK. You can access the library with the following import statement in an OpMode class:

import com.qualcomm.robotcore.<item_path>;

This provides access to the highest level of the library; the hardware and software classes are divided up into lower level items. You will need to import the items you need in your OpModes. All OpModes need one of the following imports to make the base OpMode class available for your OpMode to extend:

import com.qualcomm.robotcore.eventloop.opmode.OpMode;

or

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;

Another import you will always need is for the robot hardware classes:

import com.qualcomm.robotcore.hardware.*;

Remember that the trailing * imports all classes in the hardware item. You can import all of the hardware classes or just the specific classes you intend to use. Either way is valid.

The documentation for the FTC SDK Library is very important to read over to get a basic understanding of what classes are available for your use. The documentation is located in the FTC SDK install directory (ftc_app-master-n.n) in the sub directory doc\javadoc. Click on the file index.html to display the documentation in your browser. The doc is in web format, so you must use a browser to view it. You should create a bookmark for the index.html file. Don't forget to update this bookmark when installing new versions of the SDK.

The doc directory also contains a sub directory called tutorial. This directory used to contain useful documents describing various aspects of the robot control system. These items are now located online on GitHub, attached to each release of the FTC SDK.

Of particular interest is FTC_SDK_UserMan. This document is an extensive description of the Tetrix/FTC robot control system and the software programming environment. While it overlaps our lessons, it provides a lot of detail and is a great supplement to our lessons. It is highly recommended that you at least skim this manual. Do note that the document may be out of date in some areas but is still quite helpful. An online version of the javadoc is also associated with each release.

Don't forget the example code that is included in the SDK in the FtcRobotController\java folder of the SDK project.

 

Navigation:

Overview: 
Examine the details of using Android Studio to create OpMode classes.
Objectives: 

Understand how to use Android Studio to create OpMode classes, compile them and download them to the robot controller phone.

Content: 

Now it's time to get more familiar with Android Studio and writing OpModes.

Watch this video on writing OpModes.

Remember, each time we change an OpMode's source code, we must recompile and download the newly updated robot controller app to the controller phone. Compiling in Android Studio will take care of the download as long as your PC is connected to the controller phone with a USB cable or via WiFi.

The first time you connect your controller phone to your PC with a USB cable, the phone should install the USB driver needed for AS to communicate with your phone. If this driver install is unsuccessful and the phone is the ZTE, disconnect and reconnect the phone. When the dialog opens asking what you want to do with the USB device, select the AutoRun option. This should run the ZTE USB driver installer. Once the install is complete, reconnect your phone. The phone should now be visible to AS. If it is still not working, go to phone settings, Connect to PC, and make sure the Media Device and Enable USB Debugging options are selected. Then reconnect.

Also, the first time you connect the controller phone to your PC, the phone will prompt you to accept the RSA Security Key presented by your PC to the phone. Set the option to always accept the RSA Key from this PC and click OK.

 

Navigation:

Overview: 
Explore the details of the linear OpMode model.
Objectives: 

Understand how linear OpModes work and how to use them.

Content: 
The Linear OpMode is much simpler than the regular OpMode. You extend LinearOpMode and there is only one method to override: runOpMode(). This method is called after the Init button is pressed on the DS. So how do you separate initialization from actually starting the program running? You use the waitForStart() method inherited from the base LinearOpMode class.
 
So from the start of your code in runOpMode() to the waitForStart() method call, you place all of your initialization activity. When the Init button is pressed, all of your code up to the waitForStart() is executed and then your program waits for the Start button to be pressed. After that your program is in control until the end of the game period. You will most likely need a loop of some sort where you monitor the controllers and take action, but in autonomous mode you may just execute a set of sequential instructions. In any case, your program must end at the appropriate time. When looping, you can monitor the opModeIsActive() method (part of the base LinearOpMode class) to determine when you should stop your program.
 
Let's modify the NullOp OpMode sample to be of the linear form:
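A sketch of the linear version (a hedged reconstruction, not the exact SDK sample; the class name, annotation title and telemetry labels are illustrative):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;

@TeleOp(name = "NullLinearOp", group = "Examples")
public class NullLinearOp extends LinearOpMode
{
    private ElapsedTime runtime = new ElapsedTime();

    @Override
    public void runOpMode() throws InterruptedException
    {
        // Initialization code goes here, before waitForStart().
        telemetry.addData("Status", "Initialized");
        telemetry.update();

        waitForStart();             // wait here for the Start button
        runtime.reset();

        while (opModeIsActive())    // loop until Stop is pressed
        {
            telemetry.addData("Status", "running");
            telemetry.addData("Run Time", runtime.toString());
            telemetry.update();     // send the data to the driver station

            idle();                 // share the phone's processor
        }
    }
}
```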

We use a while loop to continue executing as long as the OpMode is active (opModeIsActive() returns true).

Note that we call telemetry.update() after the addData() calls. The update() method actually sends the data added with any preceding addData() calls to the Driver Station. This is different from the regular OpMode.

We also need to call the idle() method at the end of any looping we do to share the phone's processor with other processes on the phone.

 

Navigation:

Overview: 
Explore the details of regular model OpModes.
Objectives: 

Understand how regular OpModes function, what calls are made when and how to use a regular OpMode.

Content: 

The regular OpMode type that you would write extends the OpMode class in the FTC SDK. To add functionality, your code will override one or more of the following methods that exist in the base OpMode class:

  • init()
  • init_loop()
  • start()
  • stop()
  • loop()

These methods are called at the appropriate time in response to button presses on the driver station (DS).

The init() method is called when you select an OpMode from the list of OpModes on the DS. This is called one time, when you do the selection. It is used for very basic initialization functions your code may need to perform. It is not required.

The init_loop() method is called when you press the Init button on the DS. This method is called each time you press Init, so it will be called each time you run the same OpMode. This is the main place to perform your initialization activities. It is not required.

The start() method is called when the Start button is pressed on the DS. You can perform initialization activities more closely related to the start of program execution here. It is not required.

Note that none of init(), init_loop() or start() is required, but one of them must be present to hold the code that sets up your hardware map. You should not set up your hardware map in the loop() method. The init_loop() method is recommended.

The stop() method is called when the Stop button is pressed on the DS. You can perform any activities required to stop your program. It is not required.

The loop() method is the workhorse of the regular OpMode. Once Start is pressed on the DS, the loop() method will be called repeatedly until the Stop button is pressed. In this method you should do your robot control activities. The key idea is that you determine the current state of the robot or control inputs, respond as needed and then exit the method. The loop() method  should be kept short.

Let's look at the NullOp sample OpMode from the FTC SDK:
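A sketch of what that sample looks like (a hedged reconstruction; the exact code in your SDK version may differ slightly):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;

import java.text.SimpleDateFormat;
import java.util.Date;

@TeleOp(name = "NullOp", group = "Examples")
public class NullOp extends OpMode
{
    private ElapsedTime runtime = new ElapsedTime();
    private String      startDate;

    @Override
    public void init()
    {
        telemetry.addData("Status", "Initialized");
    }

    @Override
    public void start()
    {
        // Record a nicely formatted start date/time using standard Java classes.
        startDate = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss").format(new Date());
        runtime.reset();
    }

    @Override
    public void loop()
    {
        // In a regular OpMode, telemetry is sent automatically; no update() call.
        telemetry.addData("Start Date", startDate);
        telemetry.addData("Run Time", runtime.toString());
    }
}
```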

This simple example shows the key components of the regular OpMode style of program. You can see this example in the FTC SDK source code and compile it to the controller phone to see it work.

Note that the base OpMode class has useful fields and methods you can access by typing their names. A trick to see what is available is to type this. and wait. AS will show you the available items. You can also look at the documentation for the OpMode class. The field telemetry is an example. This is an object reference field on the base OpMode class that points to an object that allows you to send data to the driver station app to be displayed below the Start button. To use telemetry, you call the addData function with two strings. The first is a label or title for the data you want to display. The second string is the data to be displayed. You can call addData multiple times. The telemetry class has many capabilities to display data on the driver station. It is worthwhile to read the documentation for this class.

Note that the telemetry data sent to the DS is not remembered (by default) between calls to loop() or any of the other methods. You must add everything you want displayed on each loop call. This behavior can be changed if you wish.
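For example, the auto-clearing can be turned off (this assumes your SDK version's Telemetry interface includes setAutoClear(), as recent versions do):

```java
// Keep previously added telemetry items on the driver station display
// instead of clearing them before each loop() call.
telemetry.setAutoClear(false);
```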

The class ElapsedTime is a utility class in the FTC SDK provided for you to track OpMode run time. You can create an instance of ElapsedTime (here called runtime) and use it to track the time the OpMode has been running. This is done in this example so you can see that the OpMode is running and doing something. Note that the OpMode class provides a field called time and a method called getRuntime() that both provide the same information. Either way is valid.

We also use the built-in Java classes Date and SimpleDateFormat to create a nicely formatted string containing the date and time the OpMode started, as an example of using standard Java classes and displaying information about what is happening in the OpMode.
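You can try the same formatting on a desktop JVM, outside the SDK (the format pattern here is an assumption; the sample may use a different one):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateDemo
{
    // Format a Date the way the OpMode example formats its start time.
    static String formatStart(Date when)
    {
        return new SimpleDateFormat("yyyy/MM/dd HH:mm:ss").format(when);
    }

    public static void main(String[] args)
    {
        System.out.println("OpMode started at " + formatStart(new Date()));
    }
}
```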

 

Navigation:

Overview: 
Explore what OpModes are, how they work and how to get started creating your own OpModes.
Objectives: 

Understand what OpModes are, the difference between looping and linear OpModes, how we create and use OpModes.

Content: 

The term OpMode or Operational Mode (also Op Mode and opmode) refers to a class located within the FTC SDK (robot controller app source code). You create this class to add your code to the controller app. Your code is really just a part of the controller app, with the rest of the app supplied by the FTC source code. We don't modify that other part of the code, we just create the custom robot behavior we want by adding our own OpModes. Here is a quick video overview of OpModes.

So how do we do this? We create a new class and extend the FTC provided OpMode class. In essence, we add functionality to the base application by adding new OpModes to it. Each "program" we write for our robot is a class that “extends” the OpMode class. A class that is an extension of another class is a descendant or sub-class: it has (inherits) the properties and methods of the original, but those can be changed or added to. We discuss extending classes in this lesson. When a robot is being operated, the driver station is used to select an OpMode and the controller phone runs only that OpMode.

A quick refresher on robot coding. All robot programs are essentially a looping activity. Your code repeatedly runs in a loop obtaining input, acting on that input and doing it again. 

OpModes are of two types, regular and linear. In a regular OpMode, the predefined method loop() is called repeatedly during robot operation. You write code to respond to these calls to loop(). The key idea is that you do not write a "loop" in your code, the base OpMode provides that for you by calling the loop method repeatedly on a timed basis. This is similar to an event based programming model. Your code responds to the "loop" event. This model is somewhat more difficult for beginners to use.

The linear OpMode is a traditional sequential execution model. Your code is started by the base OpMode and runs on its own until execution is over. In this model you must provide the loop in your code. This model is simpler to use and understand. Note that either model is valid and the choice of OpMode is up to the programmer, however the lessons in this Unit will focus on the linear OpMode.

In either case, when you add a new OpMode (a class file) you need to tell the base controller app that you have done so. You do this by using a Java special statement called an Annotation. An Annotation is an instruction to the Java compiler and is used by the FTC SDK to register your OpMode. The Annotation is placed in your code just above the OpMode class name and contains a title for your OpMode and classifies the OpMode as autonomous or teleop. You can further place your OpModes into groups of your choosing. You can temporarily remove an OpMode by adding another Annotation which disables the OpMode. We will show exactly how this is done in the program examples we will be looking at shortly. This registration process is what makes your OpMode visible to the robot controller phone and available to run.
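A sketch of the registration (the class name, title and group strings are hypothetical; @TeleOp, @Autonomous and @Disabled are the annotations used by recent SDK versions):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

// name is the title shown on the driver station; group is your own grouping.
@TeleOp(name = "My First OpMode", group = "Exercises")
//@Disabled    // uncomment to hide this OpMode from the driver station list
public class MyFirstOpMode extends LinearOpMode
{
    @Override
    public void runOpMode() throws InterruptedException
    {
        waitForStart();
        // robot control code would go here
    }
}
```

For an autonomous OpMode you would use @Autonomous in place of @TeleOp.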

Either type of OpMode can be used to program the two modes of robot control program execution, autonomous and teleop. In autonomous, the robot is not under the control of humans and as such will receive no input from the driver station phone. This mode is timed and is typically 30 seconds long. A timer on your driver station phone can be used to control the autonomous period, stopping your robot for you or you can manually stop the robot by pressing Stop (square black button) on the driver station. Your code will have to make all decisions and handle everything the robot is designed to do in the autonomous period.

In teleop mode, the robot is operated with input from humans via the Xbox controllers attached to the driver station. This mode is typically 2 minutes long and is stopped manually (Stop button) under direction of the match referee. In this mode your code will monitor Xbox controller input and translate that input into robot actions.

Both modes are started by pressing the Init button and then the Start (arrow) button on the driver station when directed by the referee. The Init button is used to tell your (teleop only) code it should perform whatever initialization functions you have programmed and the Start button begins program execution. We will explore these modes of execution and the driver station controls in more detail shortly.

 

Navigation:

Overview: 
Explore the procedures to install the software tools needed to develop Java programs for Tetrix robots.
Objectives: 

Complete the installation of all of the software tools needed to program Tetrix robots with Java.

Content: 

This course will not delve into Tetrix hardware details or discuss how to build the physical robot. It is assumed you will learn about these topics elsewhere. However, here is a refresher (watch first 4:30) on the Tetrix hardware environment. Here is a diagram of the Tetrix system hardware components and a diagram of the basic wiring of the control system.

As we discussed earlier, the Tetrix control system consists of two cell phones, a robot controller phone and a driver station phone. You download the driver station phone application (app) and a test robot controller app from Google Play. Search Play for "FTC Robot". You do not modify the driver station app. You can also download a demo version of the controller app onto the controller phone. This allows you to get some familiarity with the two apps and how the controller phone is configured to know about the specific hardware devices (motors, sensors) that are part of your robot. Here is a lesson (watch from 14:00 to 28:12) on these two apps showing how to operate them.

The source code for the robot controller app is available for you to download and install into Android Studio (AS). This is how you create your own robot control programs, by modifying this FIRST supplied controller app. This source code is called the FTC SDK.

The procedures for installing the software tools you need are discussed in this lesson. Links to the components discussed in the video are below.

Here is a link to download the Java runtime.

Here is a link to download the Java SDK. Download the i586 file for 32-bit Windows, x64 for 64-bit Windows. When installing, you only need the Development Tools, you can X out Source Code and Public JRE.

Here is a link to download the FTC SDK on GitHub. On the GitHub page for the SDK, click the green Clone or Download button. Then click download zip file. As a suggestion, create a folder called FTC Java in your My Documents folder and extract the FTC SDK (ftc_app-master folder) from within the zip file into the FTC Java folder. Then rename that folder to include the FTC SDK version, e.g. ftc_app-master-3.4. This will allow you to keep older versions of the SDK and safely install new versions. You should not overlay an existing version with a newer version. You can find the version by scrolling down on the GitHub page to the Release Information section. The version of the SDK will be shown there. If you install a newer version of the SDK, locate the TeamCode folder inside the old ftc_app-master-n.n folder with Windows Explorer and copy your code to the TeamCode folder in the new version folder.

Here is a link to download Android Studio.

As shown in the video, at the end of the AS install process, you will be prompted to tell AS what project to start with. You will want to select Import Project (Gradle) and point it at the folder where you installed your FTC SDK (ftc_app-master-n.n) project. At this point AS will use Gradle to import and analyze your project.

Gradle is the name of the tool used with AS to compile and deploy the robot controller app. On first import of the controller project, Gradle will scan the project and determine what Android components are needed and flag any missing components as errors along with a link you can double click to install the missing item. This scan can take a long time. There can be a number of install operations flagged during the initial Gradle sync. Some of these operations can take a long time to complete. Be patient and complete each flagged install.

After Gradle processing completes, AS will show the project navigation window on the left and a blank editing area on the right, or it may show only a single blank editing area. On the vertical bar left of the navigation window or editing area, select Project. Then at the top of the project window, in the view drop down list, select Android. This will give you the simplest view of the project. You should see two main folders, FtcRobotController and TeamCode. FtcRobotController contains the low level FIRST provided components of the robot controller app. You will not need to modify any part of this code. However, the FIRST provided example code is located here. You can open the folders down to java, then org.firstinspires.ftc.robotcontroller (a package), then folder external.samples to see the example code. This example code is a very valuable resource to learn how to program many robot functions and use various sensor devices once you have completed this course.

The TeamCode folder is where you will put all of your source code. Open that folder and then java and then org.firstinspires.ftc.teamcode which is the folder (also the package) where your code will be.

One final installation step: locate the platform-tools folder in the Android SDK folder, which by default is located at C:\Users\<yourusername>\appdata\local\Android\sdk. From platform-tools copy the files adb.exe, AdbWinApi.dll and AdbWinUsbApi.dll to the C:\Windows folder.

Note: If you are in a classroom or other situation where multiple users, with different Windows user names, will share a single PC to work on this curriculum, please see these instructions on Android Studio shared by multiple users on the same PC.

We will learn more about how to use Android Studio in a later lesson.

A great resource to use while working with the FTC platform is the FTC Technology Forum.

Finally, here is a lesson package by Modern Robotics that explores the hardware components in great detail. You don't need to explore this now but you may wish to look at this material later to gain much more detailed information about the hardware components, how they work and what you can do with them. When you visit the Modern Robotics Education web site, you will be prompted to login. Click on guest access below the login boxes to access the site without registering.

Here is a documentation package that discusses using the new REV Robotics Expansion Hub controller instead of the Modern Robotics controllers.

 

 

Navigation:

Overview: 
Introduction to Java programming for the Tetrix platform.
Objectives: 

Understand the main concepts of the Tetrix robot control system hardware and software.

Content: 

This lesson is the first in the "off ramp" Unit for Tetrix programmers. This Unit contains a detailed exploration of writing Java programs for the Tetrix control system. Don't forget to complete the rest of the Java curriculum starting with Unit 12.

We have been learning a lot about the Java programming language. Now it's time to explore how we actually write, compile and deploy Java programs for the Tetrix (FTC) robotics control system.

Tetrix based robots use a far more complex control system than the EV3 (FLL) based robots. At the FTC level robots engage in autonomous activity, meaning the robot is not under the control of a human, just like EV3 robots. However, autonomous activity is a relatively small part of the match that is played in competition. The larger portion of match time is teleoperated activity, where the robot is under remote control by human operators. As such, the control system consists of two hardware devices, a robot controller device and a driver station device. The two devices are connected (paired) over a WiFi Direct network. With the Tetrix system, the two devices are Android based cell phones.

The driver station cell phone is fairly straightforward. The software for the driver station is provided by FIRST and is not modified by you. Xbox game controllers plug into the driver station phone and are the input devices for robot control.

The controller cell phone is more complex. This phone is attached to the robot and interfaces with controller hardware that allows the phone to connect to the various robot hardware devices like motors and sensors. You write the software that runs on the controller phone and operates the robot with input from the driver station phone's game controllers.

You can write programs for Tetrix robots with block based programming tools or with Java (discussion). This curriculum only deals with Java. Java programs can be developed on a Windows PC using the Android Studio IDE or directly on the controller phone with OnBot Java. OnBot Java allows you to write Java programs by using a web browser to connect to a Java development tool hosted on the controller phone. This curriculum is focused on using Android Studio to write robot control programs and will not discuss OnBot Java. However, the Java exercises in this curriculum will work if pasted into OnBot Java. You can learn about OnBot Java here.

The software tools we will be using to write Tetrix robot control programs are:

  • Driver Station phone program (phone)
  • Java SDK (PC)
  • Interactive Development Environment (PC)
  • Plugins for the Interactive Development Environment (PC)
  • Android Development Kit (PC)
  • Control program SDK from FIRST for FTC (Tetrix) (PC)

We will discuss each of these tools and how to install them in detail in the following lesson.

The Driver Station phone software is provided by FIRST and downloaded from Google Play.

The Java SDK is required on your development PC to be able to compile Java programs.

An Interactive Development Environment (IDE) is a tool that makes it easy to create, compile and deploy programs to devices. Because the robot controller  is an Android cell phone, the control program is actually a phone application. The IDE we will be using is Android Studio. Android Studio (AS) is similar to Eclipse or Visual Studio but is optimized for creating phone applications. There are plugins to AS supplied by FIRST that customize AS for use in developing Tetrix control programs.

The final piece is the FTC SDK provided by FIRST. Since the robot control program is an Android phone application, FIRST has provided a base phone application which handles the details of phone applications and includes the libraries (API) needed to access robot hardware and communicate with the Driver Station phone. The design of this base application allows you to modify the application by simply adding your own classes (called OpModes) to the base application. The base application hides the details of Android phone applications so you can focus on programming your robot. The base phone application does not do any robot control; that is the responsibility of the classes you add. This base phone application is delivered to you as an Android Studio project that generates the phone application. This project for the base phone application is referred to as the FTC SDK.

 

Navigation:

Overview: 
Explore using two sensors, Ultrasonic and Gyro, to detect and avoid obstacles while driving.
Objectives: 

Understand how to use sensors to program your robot to avoid obstacles while driving.

Content: 

We have looked at test programs for several sensors. Now let's use two sensors to create a practical example. We will take the simple driving sample and use an Ultrasonic sensor to detect obstacles in the path of the robot, then use the Gyro sensor to execute a 90 degree turn to avoid the obstacle and continue driving. A Touch sensor is used as a way to stop the program, along with the escape key on the EV3.

Create a new package called ev3.exercises.driveAvoid. Create a new class in that package called DriveAvoid and copy the following code into that class:
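A sketch of what DriveAvoid might look like using the raw leJOS sensor classes rather than the lesson's library classes (the sensor ports S1/S2/S4, the 25 cm threshold and the motor wiring on ports A and B are all assumptions):

```java
package ev3.exercises.driveAvoid;

import lejos.hardware.Button;
import lejos.hardware.motor.Motor;
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3GyroSensor;
import lejos.hardware.sensor.EV3TouchSensor;
import lejos.hardware.sensor.EV3UltrasonicSensor;
import lejos.robotics.SampleProvider;

public class DriveAvoid
{
    public static void main(String[] args)
    {
        // Assumed wiring: touch on S1, gyro on S2, ultrasonic on S4.
        EV3TouchSensor      touch = new EV3TouchSensor(SensorPort.S1);
        EV3GyroSensor       gyro  = new EV3GyroSensor(SensorPort.S2);
        EV3UltrasonicSensor sonic = new EV3UltrasonicSensor(SensorPort.S4);

        SampleProvider distance = sonic.getDistanceMode();  // meters
        SampleProvider angle    = gyro.getAngleMode();      // degrees
        SampleProvider pressed  = touch.getTouchMode();

        float[] distSample  = new float[distance.sampleSize()];
        float[] angleSample = new float[angle.sampleSize()];
        float[] touchSample = new float[pressed.sampleSize()];

        Button.waitForAnyPress();       // wait for a button press to start

        Motor.A.forward();              // drive straight ahead
        Motor.B.forward();

        while (true)
        {
            pressed.fetchSample(touchSample, 0);

            // Stop on the escape key or a touch sensor press.
            if (Button.ESCAPE.isDown() || touchSample[0] != 0) break;

            distance.fetchSample(distSample, 0);

            if (distSample[0] < 0.25f)  // obstacle closer than 25 cm?
            {
                Motor.A.stop(true);     // true = return without waiting
                Motor.B.stop();

                gyro.reset();           // current heading becomes zero

                Motor.A.forward();      // spin in place to turn right
                Motor.B.backward();

                do
                    angle.fetchSample(angleSample, 0);
                while (Math.abs(angleSample[0]) < 90);

                Motor.A.forward();      // resume driving
                Motor.B.forward();
            }
        }

        Motor.A.stop(true);
        Motor.B.stop();
        touch.close();
        gyro.close();
        sonic.close();
    }
}
```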

This program will drive the robot and if it detects an obstacle in its path it will make a 90 degree right turn and continue driving. It will drive until the escape key or touch sensor is pressed.

 

Navigation:

Overview: 
Explore how to use the Color sensor.
Objectives: 

Learn how to use the Color sensor.

Content: 

The Color Sensor is used to determine the amount of light reflected from a surface and also the color of the reflected light. The Color Sensor is typically used in line following applications where the surface is the table the robot is operating on. The Color Sensor must be close to the surface, usually about 1 cm to work well. The sensor has a multi-color LED (called the floodlight) that can be used to illuminate the surface. The Color sensor is more complicated than the other sensors. It has several modes of operation:

  • ColorID: Returns a numeric value that maps to a single color. Values can be found in the lejos.robotics.Color class. Only recognizes basic colors.
  • Red: Returns the light level (brightness) of red light. The red floodlight LED should be turned on. Red light offers better detection of light levels.
  • RGB: Returns a lejos.robotics.Color object with the Red, Green and Blue values set according to the brightness (intensity) of those colors detected.
  • Ambient: Returns the ambient light level detected.

You must select the appropriate mode for your application and you will probably need to experiment to determine which mode works best.

As we have done with the other sensors, we have a library class called ColorSensor that simplifies using the EV3ColorSensor. Create a new class called ColorSensor in the library package and copy this code into that class.

Now create a new package called ev3.exercises.colorDemo and in that package add a class called ColorDemo. Copy the following code into that class:
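A sketch of what ColorDemo might look like using the raw EV3ColorSensor directly rather than the lesson's ColorSensor library class (the sensor port S3, the display layout and the show() helper are assumptions):

```java
package ev3.exercises.colorDemo;

import lejos.hardware.Button;
import lejos.hardware.lcd.LCD;
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3ColorSensor;
import lejos.robotics.Color;
import lejos.robotics.SampleProvider;
import lejos.utility.Delay;

public class ColorDemo
{
    // Display samples from the given mode until the escape button is pressed.
    private static void show(String label, SampleProvider mode)
    {
        float[] sample = new float[mode.sampleSize()];

        while (Button.ESCAPE.isUp())
        {
            mode.fetchSample(sample, 0);
            LCD.clear();
            LCD.drawString(label, 0, 0);
            for (int i = 0; i < sample.length; i++)
                LCD.drawString(String.format("%d: %.2f", i, sample[i]), 0, i + 1);
            Delay.msDelay(100);
        }
        while (Button.ESCAPE.isDown()) Delay.msDelay(50);   // wait for release
    }

    public static void main(String[] args)
    {
        EV3ColorSensor sensor = new EV3ColorSensor(SensorPort.S3);  // assumed port

        LCD.drawString("Press any button", 0, 0);
        Button.waitForAnyPress();

        sensor.setFloodlight(false);            // LED off for ambient reading
        show("Ambient", sensor.getAmbientMode());

        sensor.setFloodlight(Color.RED);        // red LED for light intensity
        show("Red", sensor.getRedMode());

        sensor.setFloodlight(Color.WHITE);      // white LED for true color
        show("RGB", sensor.getRGBMode());

        show("ColorID", sensor.getColorIDMode());

        sensor.close();
    }
}
```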

This program demonstrates each mode of the Color sensor. After the wait for start, the ambient light intensity is displayed. You can hold the EV3 in your hand or, better yet, place it on various surfaces to see the values returned by the sensor. When done with ambient, press the escape button to move to the next mode. That mode measures the red light intensity with the red LED turned on. Press the escape button again to move to displaying the RGB color detected. Note we turn on the white light on the LED to better detect actual surface color. Press the escape button again to move to detection of a single color value. This color value is numeric, so to make our lives easier, the ColorSensor class has a method to convert the numeric color value to a color name.

 

Navigation:
