
RoboCup Junior Australia & Somerville House

Rescue Maze Workshop 6th March 2017


Contents

Introduction to the Maze Challenge
    The Challenge
    The Playing Field
    Scoring Points
Analysis
    Top Down Design
    MAZE
        Traversing the Maze
        Detecting Obstacles and floor conditions that impact movement
        Detecting Victims
        Delivering Rescue Packages
Basic Maze Skills
    1 Using Sensors to detect walls, objects and floor conditions
        Method 1 – Wait For
        Method 2 – Loop Conditions
        Method 3 – Switch Block
        Detecting Floor State
    2 Using a touch sensor to pause the program
        A Listener
        Summary
    3 Detecting Victims by Colour
        Using Colour Mode
        Using Reflected Light Mode
Advanced Maze Skills
    4 Detecting Victims by Heat
    5 Climbing and Descending Ramps
        Wheel Base
        Centre of Gravity
        Sensor Readings
    6 Mapping the Field (knowing where you are!)
    7 Maze and Pattern Searching
    8 Delivering a Rescue Package
        Using a Conveyor Belt
        Using an Actuator
Putting It All Together
Other Software Options
    EV3-G and NXT-G
    Robot-C, NXC
    MATLAB/Simulink
    OpenCV
Other Hardware Options
    LEGO Mindstorms
    Custom Builds


Introduction to the Maze Challenge

The Challenge

There has been an accident at a manufacturing plant. There are a number of victims still trapped within the plant and it is too hazardous to send in human rescue teams. Your autonomous robot must be able to navigate through a treacherous building with obstacles, uneven floors and restricted areas to identify victims and (optionally) leave rescue packages to aid anyone still trapped. Time and technical skills are essential! Come and prepare to be the most successful Rescue Maze Response Team.

The robot needs to search through a maze for colour-identifiable or heated victims; that is, the robot should not look for the fastest path through the maze, but should instead explore as much of the maze as possible. The robot will earn between 10 and 25 points for each victim found. If the robot can also deliver a Rescue Package (designed by the team themselves) close to the victim, it will earn an additional 10 points. The robot should avoid areas with black floor.

If the robot is stuck in the maze it can be restarted at the last visited checkpoint. The checkpoints are indicated with reflective floor, so the robot can save its map (if it uses one) to a non-volatile medium and restore it in case of a restart, optimising the robot's search.

If the robot can find its way back to the beginning after exploring the whole maze, it will receive an exit bonus.

The Playing Field

The maze may consist of multiple distinct areas. Areas will have a horizontal floor and a perimeter wall. Areas may be joined together by doorways or ramps. Walls that make up the maze are at least 15 cm and up to 30 cm high. Doorways are at least 30 cm wide. Ramps will be at least 30 cm wide and have an incline of at most 25 degrees from the horizontal. The ramp is always straight.

Floors may be either smooth or textured (like linoleum or carpet), and may have up to 3 mm height difference at joints. There may be holes in the floor (about 5 mm diameter) for fastening walls. A tile is defined as a 30 x 30 cm space, aligned to the grid made up by the walls. There may also be silver tiles that represent Checkpoints. Additional Checkpoints will be placed randomly at the start of each round. Silver tiles may not be completely fixed to the floor.

Throughout the arena, there may be black tiles that represent "no go" spaces. Black tiles will be placed randomly at the start of each round. Black tiles may not be completely fixed to the floor.

Walls may or may not lead to the entrance/exit. Walls that lead to the entrance/exit are called linear walls. The walls that do NOT lead to the entrance/exit are called "floating walls". Paths will be approximately 30 cm wide, but may open into foyers wider than the path. One of the outermost tiles is the starting tile, where a robot should start and exit the run. The starting tile is always a checkpoint.


Floor obstacles are fixed to the floor, have a maximum height of 2 cm and will be no closer than 5 cm apart. Debris will not be fixed to the floor and has a maximum height of 1 cm. Debris can be placed anywhere on a tile. Room obstacles have a minimum height of 15 cm, may consist of any large, heavy items and can be any shape, including rectangular, pyramidal, spherical or cylindrical. Obstacles will not prevent a robot from discovering routes in the maze. An obstacle may be placed in any location where at least 20 cm is left between the obstacle and any walls. Obstacles that are moved or knocked over will remain where they are moved to or fall, and will not be reset during the scoring run.

Teams should expect the environmental conditions at a tournament to be different from the conditions at their home practice field. Teams must come prepared to adjust their robots to the conditions at the venue.

Scoring Points

Action Completed                                            Score
Victim Identified – Colour ID or Heated – Linear Wall       10
Victim Identified – Colour ID or Heated – Floating Wall     25
Rescue Kit Deployed to victim                               10
Checkpoints                                                 10
Exit Bonus                                                  10 x n (identified victims)


Analysis

Top Down Design

Starting with the whole problem, look to break it down into smaller sub-problems. If these are still too complex to solve, break them down even further.

MAZE

Traversing the Maze

How can the robot drive through the maze? What sensors are needed for successful maze navigation? What motors? What is the maximum size the robot can be and still traverse the maze? What technology/software solution will give the best results for me?

Detecting Obstacles and floor conditions that impact movement

How is my movement through the maze affected by the possible impediments? How do I respond to a black floor? A silver floor? Floor-level debris?

Detecting Victims

What sensors do I need to detect victims? How can I see all sides?

Delivering Rescue Packages

What mechanism will I need to activate to deliver a Rescue Package when I do detect a victim? How does this impact the build of my robot?


Basic Maze Skills

1 Using Sensors to detect walls, objects and floor conditions

There is a huge range of sensors available to use with this challenge. Focussing on the EV3 kits, the following come with the standard education kit:

Additionally, there are third party sensors available from suppliers like HiTechnic or Mindsensors:

Vision Sensor (Camera)
Pixy Cam
IR Temperature Sensor
Accelerometer/Compass

The first task is to determine which sensor will give the robot the most accurate feedback for the task at hand. For

example, a touch sensor could be used to determine if there is a wall in front of the robot (bumper bar), or an ultrasonic

or vision system could be used to track a set distance from a wall.

Let’s start by using an Ultrasonic Sensor to see a wall in front of the robot.

This sensor array carries a Colour Sensor facing down, a Colour Sensor facing

forwards, an Ultrasonic Sensor facing forwards and has a space for the Mindsensors

Thermo Sensor directly below the Ultrasonic Sensor. Place all sensors and motors as

per the table below:

Sensor/Motor                  Port
Thermo Sensor                 1
Gyro Sensor                   1 when using it (unplug Thermo Sensor)
Touch Sensor                  1 when using it (unplug Thermo Sensor)
Colour Sensor Forward         2
Colour Sensor Down            3
Ultrasonic Sensor Forward     4
Motors                        B & C (same side)

To detect a wall, we need to use the data from the sensor to control the motor actions.

Method 1 – Wait For

The Wait block can be used to wait until a sensor reading meets a set condition. For the ultrasonic sensor, we can use the Wait block to wait until the distance is less than a set value. To test this, set the motors on (use the Tank Movement block) at 50% power, then place a Wait block.

Motors On at 50% → Wait for Ultrasonic (Compare cm) → Motors Off

Putting it all together:
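The workshop programs themselves are built from EV3-G blocks; purely as a rough illustration, the same logic can be sketched in Python. The robot helper functions below (tank_on, ultrasonic_cm, tank_off) are hypothetical stand-ins for the blocks named above, not a real EV3 API.

```python
# Minimal sketch of the "Wait For" approach, assuming hypothetical robot helpers.
def drive_until_wall(robot, stop_cm=20):
    robot.tank_on(left=50, right=50)        # Motors On at 50%
    while robot.ultrasonic_cm() > stop_cm:  # Wait for Ultrasonic (Compare cm, < 20)
        pass                                # do nothing until the wall is close
    robot.tank_off()                        # Motors Off
```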

Method 2 – Loop Conditions

The same result can be achieved with a loop that has a condition to end it. In this case we want the Motors On (Tank Movement On at 50%) inside the loop. To end the loop, use the same settings as for the Wait block – Ultrasonic – Compare (cm). The whole program will look like the following:
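In the same hypothetical Python sketch, the loop-condition version keeps the motor command inside the loop and uses the sensor test as the loop's exit condition:

```python
# Minimal sketch of the loop-condition approach, using the same hypothetical helpers.
def drive_until_wall_loop(robot, stop_cm=20):
    while robot.ultrasonic_cm() > stop_cm:  # loop ends once the wall is within 20 cm
        robot.tank_on(left=50, right=50)    # Motors On each time around the loop
    robot.tank_off()                        # Motors Off after the loop exits
```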


Method 3 – Switch Block

The third way to achieve the same result is to use a Switch (decision) block, with the same condition as before – Ultrasonic – Compare (cm). Here the TRUE (✓) side of the Switch block indicates that the robot is within 20 cm of the wall, so we use the Motors Off block. On the FALSE (✗) side of the decision we use the Motors On block. This will look like:

Note that to make this work the Switch block must be inside an INFINITE LOOP. If there were no Loop block around this program, what would happen?
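Again as a rough Python sketch only (hypothetical helpers), the switch-inside-a-loop structure becomes an if/else inside an endless loop:

```python
# Minimal sketch of the switch-block approach, assuming hypothetical robot helpers.
def drive_with_switch(robot, stop_cm=20):
    while True:                              # infinite loop around the switch
        if robot.ultrasonic_cm() < stop_cm:  # TRUE branch: the wall is close
            robot.tank_off()
        else:                                # FALSE branch: keep driving
            robot.tank_on(left=50, right=50)
# Without the surrounding loop, the condition would be checked exactly once
# and the program would end immediately.
```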

Detecting Floor State

The floor state is also important. The robot must not travel across a BLACK floor (void), and should indicate through flashing lights that it has found a SILVER floor (checkpoint). The Colour Sensor facing down can detect the standard floor, black floor and silver floor, and as such is good for this task. It operates in exactly the same way as detecting a victim by colour, so we will hold off on discussing this further for now.


2 Using a touch sensor to pause the program

A touch sensor can detect three states for the red button on the front of the sensor:

0. Released – the sensor button has been let out.

1. Pushed – the sensor button has been pushed in.

2. Bumped – the sensor button has been both pushed and released.

There are many approaches to pausing the program with a sensor, and we will look at just one approach here.

To begin, do we need to pause the program after EVERY action? Or can we pause at certain intervals, knowing that resuming from these intervals is good enough?

In this program, we will only pause just after the start of the Loop block. Our program will be simple: the robot will drive in a square pattern, pause when the touch sensor is pushed, and resume once it is pushed again.

The basic program to drive a square is shown below – it uses Move Tank – On for Rotations set at 50% power. It also uses a loop inside the main loop, since a square is made by repeating the instructions DRIVE STRAIGHT, TURN RIGHT ANGLE four times.

One strategy to insert a pause on touch would be to put the Move blocks for this program inside a Switch (on the FALSE line). The switch condition would be TOUCH SENSOR – COMPARE – STATE. Set the state to 1 (pushed). On the TRUE line have a Wait block (set it to five seconds). Test to see what happens…


Did the robot stop?

You might have noticed that if you just pushed the sensor in and let go straight away, the robot did not pause – it just kept

on going! However, if you held the sensor in until the robot had finished turning, the robot did indeed stop for five

seconds. This is because we are only collecting the touch data at a single point in time. While the robot is executing the

Move Blocks it is not reading any sensor data (not listening).

If we want the robot to be always listening, we need to create a second program that runs at the same time as the main

program. This second program will be listening all the time for the touch sensor to be pushed, and when it is, it will pause

the program at the appropriate time (not immediately). However, this introduces a new problem…How can the second

program tell the first program the touch sensor has been pushed?

Luckily we have a solution for this – variables! We can store the STATUS of the touch sensor in a TRUE/FALSE Variable Block (these are called LOGIC variables in EV3-G). There are a number of options, so we need to understand how variables work in EV3-G. A Variable Block can hold a VALUE for a selected VARIABLE NAME. This VALUE can be text, a number, a logical value (True/False) or an array of values (an array is a bunch of variables of the same type all stored together).

Additionally, a Variable Block can be READ (we read the data to use it to make decisions) or WRITTEN (we store a value into the variable). It is important that we use the Variable Block in the correct mode, otherwise it just won't work!

A Listener

To write a second program, we need a second Start block. When your program downloads and runs, it will start BOTH programs at exactly the same time, and they will both run together. The listener will ONLY listen for the Touch Sensor PUSH event, and will WRITE a value to the Variable Block when it detects a touch.


To start the listener we should INITIALISE our variable. This is where we set the Variable Block: create a variable named TOUCH_PAUSE of Logic type, in WRITE mode, and set the value to FALSE (no touch detected). This ensures that we don't get an accidental program pause at the very start. It is good practice to set an initial value for all variables.

Next, we need to listen for the rest of the program. This will require an infinite loop. Inside the loop we have a Wait block that waits for the Touch Sensor to receive a PUSH event. Once it does, we WRITE the value TRUE to the Variable Block (sensor was pushed). The listener has done its job!

So, how do we use this value in the main program?

Instead of the Switch block checking if the Touch Sensor has been pushed, we will now use the Variable Block in READ mode and use this value to determine what the Switch block will do.

The Variable Block will need to feed its value into the Switch block. We use a data wire to do this (make sure the Switch block is set to Logic). Now, when the Variable Block value is false, the program will continue moving in a square. Once the touch sensor is touched, the variable value will be updated to TRUE and the Switch block will run the upper path.

On this upper path, we need to pause the program and wait for some action to get the program started again. Initially it seemed like a good idea to push the Touch Sensor a second time to get the program started again. This was not a good idea! Every time the Touch Sensor is pushed it sets the variable value to TRUE, and pauses the program. The second idea was much more successful – by using the buttons on the brick (in this case the centre button) to resume the program, we don't trigger another Touch Sensor push event. I also used this as an opportunity to change the colour of the lights on the EV3 brick to indicate the program was indeed in pause mode. Finally, before we go back to listening and square-driving, we need to reset the variable to FALSE so that we are ready for the next time the program needs to be paused.

The upper path ends up looking like this:

The last block resets the EV3 lights to the default mode (flashing green when the program is running).
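As a rough Python sketch only (the workshop solution uses two EV3-G programs and a Logic Variable Block), the listener pattern looks like a background thread writing to a shared flag that the main loop reads. The robot helper functions are hypothetical stand-ins for the blocks described above.

```python
# Minimal sketch of a listener program plus main program sharing a logic variable.
import threading

touch_pause = False                      # the TOUCH_PAUSE variable, initialised to FALSE

def listener(robot):
    global touch_pause
    while True:                          # infinite loop: always listening
        robot.wait_for_touch_push()      # Wait block: Touch Sensor, push event
        touch_pause = True               # WRITE TRUE: a pause has been requested

def main(robot):
    global touch_pause
    threading.Thread(target=listener, args=(robot,), daemon=True).start()
    while True:
        if touch_pause:                  # READ the variable into the switch
            robot.set_brick_light("orange")   # show that we are paused
            robot.wait_for_centre_button()    # resume from the brick button, not the touch sensor
            touch_pause = False          # reset, ready for the next pause
            robot.set_brick_light("green")    # back to the default light
        else:
            for _ in range(4):           # drive one square: straight + right turn, four times
                robot.drive_straight(rotations=2, power=50)
                robot.turn_right_90()
```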

Summary

There have been a few new concepts introduced to complete this task. The good news is that many of these concepts will be useful in the rest of the programs needed to complete the Maze challenge.


1. Variables. These are place-holders for values that we can use throughout our program.

2. Second Program. You can run multiple programs at the same time! Be careful, though: if two programs try to move the same motor ports at the same time, the results can be unpredictable. Try to separate the jobs that need to be done, and don't share the same sensor or motor across multiple programs.

3. Data Wires. These are very useful – we can use sensor values (data) directly in the structures we use (loop, switch, etc.). This gives us many more options when writing a program.

4. Using Brick Lights and Buttons. This will be useful when it comes to identifying the victims we find. Instead of adding lights to your EV3 robot, you can flash the lights on the EV3 brick to prove that you have found a victim.


3 Detecting Victims by Colour

In Queensland, we have decided that the victims will be coloured BRIGHT RED. This colour is easy to distinguish from most background colours using the EV3 Colour Sensor (in either reflected light or colour mode). For the sake of this workshop, we will detect YELLOW and ORANGE victims (paper squares – these will be flat to the wall, and matte rather than reflective).

Using Colour Mode

The LEGO EV3 Colour Sensor can detect true LEGO colours. Although the yellow and orange squares aren't LEGO-defined colours, the Colour Sensor in Measure – Colour mode will detect them as Yellow or Red (the sensor reports a number – each number represents a colour detected, or 0 for no colour detected).

The wooden walls of the maze report back as 7 (Brown). This is handy, as it lets us tell the victim apart from the wall. There is a problem – as always! The sensor reports different colours at different distances from the wall. For the workshop robot, the ideal distance was when the Ultrasonic Sensor was about 3.5 cm from the wall (the Colour Sensor is considerably closer).

To see this, have the robot drive forwards slowly (speed 5 is enough) until the Ultrasonic Sensor reads 3 cm (or less). Stop driving and check the colour of the victim patch on the wall. It should read 4 (Yellow) or 5 (Orange/Red). Run the program again, facing a blank wall. The reading should be 7 (Brown) or 1 (Black).

The following program puts this into action:
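In rough Python form (hypothetical helpers standing in for the EV3-G blocks, using the colour codes quoted above), the check could look like this:

```python
# Minimal sketch of the colour-mode victim check described above.
YELLOW, RED, BROWN, BLACK = 4, 5, 7, 1            # colour codes quoted in the text

def check_wall_for_victim(robot):
    robot.tank_on(left=5, right=5)                # creep forward slowly
    while robot.ultrasonic_cm() > 3:              # until about 3 cm from the wall
        pass
    robot.tank_off()
    colour = robot.forward_colour()               # Colour Sensor, Measure - Colour
    return colour in (YELLOW, RED)                # victim patch rather than bare wall (BROWN/BLACK)
```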

Using Reflected Light Mode

The Colour Sensor can also be used to measure the amount of reflected light. This often gives us a much better scale, accounting for variations in colours and tones. Very dark objects return a very low number, as they absorb light and reflect very little back to the sensor (black may return a number around 4). Very light objects (white) or highly reflective objects (alfoil/silver metallic tape) absorb almost no light and reflect most of it back, resulting in high numbers (60-80 for white and 90-100 for metallic silver). Once again, however, the closer the sensor gets to the wall, the higher the values will be, and the more separation there will be between darker and lighter objects.
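As a small illustration only, the reading could be classified with thresholds based on the rough values above (these thresholds are assumptions; calibrate against your own field and sensor distance):

```python
# Rough classification of a reflected-light reading (assumed thresholds).
def classify_reflection(value):
    if value < 10:
        return "black"      # very dark floor or wall, around 4
    elif value >= 85:
        return "silver"     # metallic tape, roughly 90-100
    elif value >= 60:
        return "white"      # roughly 60-80
    return "other"
```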


Advanced Maze Skills

4 Detecting Victims by Heat

This is the first task we have seen that cannot be accomplished using standard LEGO EV3 sensors. There is an EV3 Thermometer Sensor, but it is a contact sensor, and the measurement takes a reasonable amount of time (a few seconds at the very least). A better type of sensor would be a Thermal Infrared (Thermo IR) Sensor. There are a few on the market – the one being used on the workshop robot is from Mindsensors, available through the MTA catalogue.

The approach to detecting the victim here is very similar to that for detecting a victim by colour. To demonstrate the sensor working, the following program shows the target temperature (in front of the sensor) on the screen:

The sensor calculates an ambient temperature, and this can be used to determine if the target temperature is above

ambient, and by how much. You will note that the target temperature reduces rapidly with distance from the object, so it

is important that temperature readings are taken close to the heat pad (victim). This is not too much of a problem as the

robot size will probably keep the sensor close to the wall.

Write a program that has the robot turn around until it sees a heat source more than 5 °C above ambient temperature. When it sees this object, stop turning and set the EV3 brick lights to solid red for 3 seconds.

This program uses a variable to store the difference between the ambient temperature and the target temperature. If it is more than 5 °C, the robot will stop turning and set the lights to red for 3 seconds.
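A rough Python sketch of the same heat-search program, with hypothetical helpers standing in for the Thermo IR sensor blocks and brick light blocks:

```python
# Minimal sketch of turning until a heat source is found (hypothetical helpers).
import time

def turn_until_heat_source(robot, threshold_c=5):
    ambient = robot.thermo_ambient_c()               # reference (ambient) temperature
    robot.tank_on(left=20, right=-20)                # spin on the spot
    while robot.thermo_target_c() - ambient <= threshold_c:
        pass                                         # keep turning until a hot object appears
    robot.tank_off()
    robot.set_brick_light("red")                     # solid red signals a victim
    time.sleep(3)
    robot.set_brick_light("green")                   # back to the default light
```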


5 Climbing and Descending Ramps

As the Qld maze will not include any ramps in 2017, this will not be specifically covered during this workshop.

Some things to consider when climbing and descending:

Wheel Base

This is the distance between the front and rear wheels of a vehicle. The standard robot we have used for this workshop has two LEGO wheels with balloon tyres attached to motors, and a ball-bearing trolley wheel at the rear. The distance between these wheels is the wheel base. If it is too small, the robot could topple over when tilted up or down. If it is too long, the robot could become 'beached', where the body of the robot touches the ground, effectively lifting the drive wheels off the ground. No drive wheels, no motion!

Centre of Gravity

This is related to a smaller wheel base, needed to prevent beaching the robot. If there is too much weight placed up high, the robot could fall over, or at the very least become unstable. The lower the heavier components sit, the more stable the robot will be once tilted.

Sensor Readings

Several sensors can be affected by tilting. Compass and Gyro sensors are particularly sensitive; they can produce errors of up to 10% at a 25° tilt angle. Additionally, the colour sensor pointing down will give different values at the top and bottom of the ramp, as the sensor gets closer to or further from the floor. This might be overcome by better placement of the sensor. Understanding the source of errors is useful when it comes to accounting for them.


6 Mapping the Field (knowing where you are!)

This is the golden ticket for Rescue Maze. It gives the robot the ability to map out the rooms, and then use the map to help navigate through tiles that have already been visited, or to return to the exit tile to get the exit bonus.

This workshop will not solve this problem for you, but will introduce a technique that might be useful. This technique for mapping uses an array of numbers, where the POSITION and the VALUE both indicate some aspect of the room. It could result in an 8-digit number, with the digits representing:

1. West Wall Status
2. North Wall Status
3. East Wall Status
4. South Wall Status
   (wall status values: 0 – unknown, 1 – no wall, 2 – wall, 3 – victim on wall)
5. Floor Status (0 – standard floor, 1 – black floor, 2 – silver floor)
6. X Coordinate
7. Y Coordinate (start tile is X=0, Y=0; increment by 1 when moving in the X or Y direction)
8. Visited Status (0 – not visited, 1 – visited)

So a room could generate the following 8 numbers:

1 2 1 2 0 1 2 0

This room has walls on the north and the south, no victims, a standard floor, is located one room to the right of the entrance and two rooms north of the entrance, and has not previously been visited (this will change to a 1 as we leave the room).

By storing each room as an array of 8 values, it would be possible to build up an entire map. Calculating a return to the

entrance could be done by following the rooms visited in reverse order. Once the robot returns to room X=0, Y=0 it would

stop for ten seconds then shut off the program, gaining an exit bonus.
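A small Python sketch of the idea, storing each visited tile as the 8-value record described above, keyed by its (x, y) coordinate (illustrative only; the names and structure are assumptions, not part of the workshop solution):

```python
# Minimal sketch of a tile map built from 8-value records.
UNKNOWN, NO_WALL, WALL, VICTIM = 0, 1, 2, 3       # wall status values
STANDARD, BLACK, SILVER = 0, 1, 2                 # floor status values

maze_map = {}   # (x, y) -> [west, north, east, south, floor, x, y, visited]

def record_tile(x, y, west, north, east, south, floor):
    maze_map[(x, y)] = [west, north, east, south, floor, x, y, 0]

def mark_visited(x, y):
    maze_map[(x, y)][7] = 1                       # set the Visited digit as we leave the tile

# The example room from the text: walls north and south, standard floor,
# one tile east and two tiles north of the entrance, not yet visited.
record_tile(1, 2, NO_WALL, WALL, NO_WALL, WALL, STANDARD)
```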


7 Maze and Pattern Searching

The most common technique for finding your way through a standard maze is wall-tracking – that is, imagine you are in the maze and place your left hand on the wall. Continue travelling through the maze, always keeping your left hand on the wall. In a true maze you will eventually find your way through to the end. This is because a true maze has no 'floating walls' (walls not connected to the outer edge).

The first approach might be to measure the length of a tile, and have the robot move using wheel rotations or rotation degrees from the centre of one tile to the centre of the next (a distance of 30 cm). This would also require working out the degrees of wheel rotation needed to turn exactly 90 degrees (a right angle).

This is achieved through TEST programs. Work out how far the robot travels over 360 degrees of wheel rotation, then calculate the degrees of rotation required for 30 cm.
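For example, assuming the robot travels about 17.6 cm per full wheel rotation (the figure for a standard 5.6 cm diameter EV3 wheel; measure your own robot's travel rather than relying on this number):

```python
# Worked example of the test calculation (assumed wheel travel; measure your own robot).
travel_per_rotation_cm = 17.6                       # roughly pi x 5.6 cm wheel diameter
degrees_for_one_tile = 30 / travel_per_rotation_cm * 360
print(round(degrees_for_one_tile))                  # about 614 degrees of wheel rotation per tile
```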

Following the testing, create a program that effectively sticks to the LEFT WALL of the maze, using the calculated distances and turns. The first Tank block moves the robot into the maze from the start tile. Since there will always be no wall 180 degrees from where the robot is pointed (we just travelled through that gap to get to this tile), turn the robot around 180 degrees, then check each 90-degree turn for a gap in the wall. The first gap found is the one the robot will take.
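A rough Python sketch of the same left-hand rule, one tile at a time, using hypothetical helper functions for the tested tile moves and 90-degree turns:

```python
# Minimal sketch of left-wall following (hypothetical helpers).
def follow_left_wall(robot):
    robot.drive_one_tile()              # move off the start tile into the maze
    while not robot.on_start_tile():    # loop until we arrive back at the entrance/exit
        robot.turn_left_90()            # always try the left-hand opening first
        while robot.wall_ahead():       # otherwise rotate right until a gap appears
            robot.turn_right_90()
        robot.drive_one_tile()          # take the first gap found
```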

This technique has at least one obvious flaw! If the turns are even slightly off 90 degrees, or the forward distance is even slightly over or under 30 cm, then with every tile this small error is magnified. We need to find a way to correct the robot's position and orientation every now and again.


One method might be to use the ultrasonic sensor to measure the distance to a wall, and reset to a known distance from

every wall it sees. This should push the robot back into the centre of the tile. This will not correct the turns, but might help

compensate a little for any distance errors.

Unfortunately, this maze searching technique won’t work on some of the Rescue Maze layouts, since they are allowed to

include ‘floating walls’.

Diagrams: Simple (true) maze – no floating walls; Complex maze – floating walls

On the second maze, following the wall will not allow the robot to visit five tiles in the centre of the maze. Another

solution is needed…this is something we WON’T solve today.


8 Delivering a Rescue Package

The rescue package can be anything with a volume of at least 0.5 cm3. At a similar workshop in Western Australia they used M&M chocolates as the rescue package. Here we have decided to use Starburst Fruit Chews (because they are rectangular prisms and wrapped!). You could use a 2x2 LEGO block, or virtually anything you can imagine, so long as it meets the size requirement.

Once you have detected a victim by colour or by heat, the robot needs to deliver one rescue package. This really is a

hardware system that can be built on or into the robot design. I have two such systems, each with their own problems, to

demonstrate the principle. Neither is attached to a robot and neither one would be ideal for this challenge – they are

simple examples of mechanisms that complete the task.

Using a Conveyor Belt

The first system uses a conveyor belt to move each rescue package forward until it falls off the front. Using the LEGO EV3 tracks with 3M friction pegs, it is possible to squeeze the Starburst lollies into each section. These release when the belt flows around and under the track system.

As a basic concept this seems to work well; however, it needs refinement. A conveyor using a full 15M beam only really holds 8 rescue packages securely. This might not be enough.

Using an Actuator

Another method is to build a chute and essentially 'push' rescue packages out of the chute with an actuator. This allows us to build a chute as long as needed to hold however many rescue packages you want (the maximum in the rules is 12).

This also makes it easier to drop the package next to the victim, and there are some ways this can be easily incorporated into the original robot design. The drawback of this system is its length – it is essentially 30M long (2x 15M beams). When a motor is added to control it, this could be quite a large contraption.


Putting It All Together

Successfully navigating the Maze, locating all victims, deploying a rescue package to each, traversing the whole Maze and locating all checkpoints is not a trivial task. In fact, it would be rare for any team to do all of this on any run through the Maze.

So for most teams it will be about making choices. The first choice is to put all the sub-tasks into a PRIORITY list, with the most important tasks at the top of the list, down to the least important tasks at the end.

Implement your solution to accommodate these sub-tasks in the order you set. You might abandon the idea of finding all

the checkpoints, using a random grid search pattern to hopefully cover the whole maze before finding the entrance/exit

tile. You might decide that deploying rescue packages was not possible as they constantly got stuck under the wheels of

the robot (as mine did!).

Whatever set of priorities you set, use them to guide the method you use to incorporate each element into the main

program. For some you can use a Loop Block, others a Switch Block or a Wait for Sensor Block. You might create your own

My Blocks to improve the readability of your code, or create separate programs to ‘listen’ for sensor triggers, and then

break into the main program to accomplish a task.

The secret to putting it all together to maximise your chances of success is to plan carefully and re-evaluate the plan

regularly. Don’t test only once everything has been written into the code. For every change made, test to see the impact.

Did the change improve your response? If yes, keep the change and move on. If no, analyse why. Is there a sensor value

that needs to be reset? Have I left some test code in the final copy? If the analysis determines that the approach won’t

work (ever) remove the change and use the information you have gathered to try a different approach.

Rescue Maze is a complex task, and everyone will be approaching it in a different way, using different technologies. The

best way to approach it is to use good planning and good communication between team members to attack the task one

sub-task at a time!


Other Software Options

EV3-G and NXT-G

These are the standard LEGO platform languages for NXT and EV3 robots. Whilst they offer many advantages (ease of use being the major one), they also restrict us (mathematical operations are tricky and complex).

Robot-C, NXC

The standard C language for robots is Robot-C. This allows much greater flexibility, particularly with non-LEGO robots. Anyone familiar with any of the C-based languages will be able to use Robot-C on the LEGO platform as well. It is, however, a language with a relatively steep learning curve – not a beginner-level language!

MATLAB/Simulink

This is for the serious teams! MATLAB is used in universities, in industry and in high-end research facilities around the globe. Simulink is used in most flight simulators and for live processing of data.

Any RoboCup Junior Australia team has FREE access to the full MATLAB/Simulink package through our sponsorship agreement with MathWorks. There are numerous guides to connecting and running the EV3 and NXT robots with MATLAB/Simulink. For teams that want to go to the next level, write full production scripts and use process engineering procedures to control their robot, this is the way to go.

OpenCV

This is a set of libraries that is excellent for working with vision systems. The libraries make some standard vision tasks quite simple, for example identifying objects, tracking objects and filtering the input by colour. There is much promise here, but a very advanced set of skills is required.


Other Hardware Options

LEGO Mindstorms

This workshop has focussed entirely on the LEGO EV3 hardware. As you have seen, this hardware does not offer enough sensor ports to meet the challenge fully. There are options (e.g. sensor multiplexors or multiple EV3 bricks), but these add a further level of complexity to an already complex task.

Custom Builds

Many teams will meet this challenge by building their own robots, based around a commercial processor like the Raspberry Pi, Arduino or Edison. Buying custom sensors and building from scratch is very appealing – you can build what you want, how you want, in the shape you want, with all the sensors you want (within reason). Using 3D printers and plastic or wooden plates, it is possible to build elaborate and complex robots. However, the more complex the robot, the more complex the programming, and the more things that can go wrong! Certainly not for the beginner, but there are some good projects in the marketplace that help build you up to a custom Maze robot.