
A Petri Net based Lego for searching

Vegard Torkelsen and Jonathan Brian T. Alcoriza

May 2012

Summary (Abstract)
The aim of this project is to create a robot which implements a searching routine to find objects in an area and remove them from it. The searching routine will be fully autonomous, meaning that the robot will find objects and remove them, making all the decisions and calculations by itself with no interaction from the user. Lego Mindstorms NXT is a programmable robotics kit by LEGO. In this project the Lego Mindstorms NXT 2.0 is used, which is the newest set in LEGO's Mindstorms series. The NXT we are going to design will have 2 wheels on individual motors and will use an ultrasound sensor to find the objects. The NXT will calculate which object is closest to its current position and drive towards it. We constructed a grabbing mechanism, a claw which uses a third motor to open and close. After the NXT has found and positioned itself in front of the object, it will close the claw and drive to the edge of the search area. We will be using a Bluetooth connection to communicate between the NXT and a computer. This is implemented using Petri Net and GpenSIM.

Acknowledgements
We would like to thank our teacher Prof. Reggie Davidrajuh for guidance and tips on GpenSIM, and Ståle Freyer for lending us the Lego kit and helping us when we had questions about the Lego and setting up the connection.

Contents

1 Introduction                                        5
  1.1 Problem definition                              5
  1.2 Motivation                                      5
  1.3 Relevance and adoption                          6

2 Background                                          7
  2.1 Lego Mindstorms NXT                             7
  2.2 MATLAB                                         10
  2.3 RWTH - Mindstorms NXT Toolbox for MATLAB       10
  2.4 Petri Net and DEDS                             10
  2.5 GpenSIM                                        11
  2.6 Autonomy and Artificial Intelligence           11

3 Method and design                                  12
  3.1 Implementation and coding                      13

4 Testing, analyzing and result                      35
  4.1 Example run                                    35
  4.2 Limitations and challenges                     59
  4.3 Analyzing                                      61
  4.4 Result                                         62

5 Discussion                                         63
  5.1 Connection                                     63
  5.2 Petri Net                                      63
  5.3 Further work                                   64

6 Image Reference                                    66

Bibliography                                         66

A Building guide                                     69

B User manual for Lego NXT Clean-up robot            70
  B.1 Introduction                                   70
  B.2 The search area                                70
  B.3 Adjusting and starting the program             72

C Installation guide for Lego NXT Clean-up robot     73
  C.1 Installing software                            73
  C.2 RWTH Mindstorms NXT Toolbox                    73
  C.3 Installing the program                         74
  C.4 Setting up the bluetooth.ini file              74
  C.5 Connecting to the NXT                          75
  C.6 Troubleshooting the bluetooth connection       75

D MATLAB code                                        77
  D.1 Action Commands                                77
  D.2 Main program                                   92
  D.3 Transitions (in alphabetical order)            95

Chapter 1
Introduction
This project was created for the subject MID-280 Discrete Simulation and Performance Analysis, spring semester 2012, by Vegard Torkelsen and Jonathan Brian T. Alcoriza.

1.1 Problem definition


The main issue of this project is to create a robot using Lego Mindstorms NXT that implements a searching routine created using GpenSIM and Petri Net. Petri Net is a discrete event dynamic system. We wish to create a robot that is as autonomous as possible: the robot should be able to make decisions on its own. It will be interesting to see how well GpenSIM and Petri Net work in this field and whether GpenSIM should be used in more robotics. In this project the Lego Mindstorms NXT 2.0 will be used, which is the newest set in LEGO's Mindstorms series. The NXT we are going to design will have 2 wheels on individual motors and will use an ultrasound sensor to find the objects.

1.2 Motivation
We are both studying Cybernetics and played with Lego when we were younger. Robots and new technology interest us, and being among the first


students to get to program and try out the Lego NXT at the University of Stavanger is a good opportunity for us to try something new.

1.3 Relevance and adoption


Robots which are fully (or almost fully) automated are being used more and more in everyday life as well as in industrial work. Some examples of real-life AI[4]:

- Warehouses with mobile robots that store and fetch requested items
- Agents outperforming humans in trading on the stock exchange
- Automated scanning and interpretation of X-ray images in medicine
- Image matching and recognition in mobile Internet search
- Neural networks for fraud detection in banking transactions
- Autonomous cars navigating and driving unassisted in normal traffic

In this project, the robot created has been inspired by mobile robots in warehouses that look for items and put them at the required area.

Chapter 2
Background
2.1 Lego Mindstorms NXT
Lego Mindstorms NXT [8] is a programmable robotics kit by LEGO. In this project the Lego Mindstorms NXT 2.0 was used, which is the newest set in LEGO's Mindstorms series.

2.1.1 NXT Intelligent Brick

The main component of the Lego Mindstorms kit is the NXT Intelligent Brick, which is a brick-shaped computer. It can get input from up to four different sensors[3] and has three outputs which can control up to 3 motors using RJ12 cables. It has a monochrome 100x64 pixel screen and four buttons for navigating a hierarchical menu. The brick also has a speaker that can play sound files.


Figure 2.1: NXT Intelligent brick

Technical specifications:

- 32-bit ARM7 microcontroller
- 256 KBytes FLASH, 64 KBytes RAM
- 8-bit AVR microcontroller
- 4 KBytes FLASH, 512 Bytes RAM
- Bluetooth wireless communication
- USB full speed port, 12 MBit/s
- 4 input ports, 3 output ports, 6-wire cable digital platform
- 100 x 64 pixel LCD graphical display
- Loudspeaker, 8 kHz

2.1.2 Sensors

There are many Lego sensors[3] that can be bought separately, as well as third-party sensors: colour sensors, sound sensors, compasses, gyroscopes, RFID readers, accelerometers and temperature sensors.


Figure 2.2: Push Sensor

- Push sensor - can detect whether it is being pressed, released or bumped.

Figure 2.3: Ultrasonic sensor

- Ultrasonic sensor - measures distances in centimeters or inches and can detect movement. The sensor can measure distances from 0 to 233 cm with a precision of 3 cm. It measures distance by timing how long a sound wave takes to hit an object and return.
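As a rough illustration of this time-of-flight principle (the conversion happens inside the NXT firmware; the echo time below is invented for the example):

```matlab
% Illustration only: the NXT firmware performs this conversion itself.
% The echo travels to the object and back, so halve the round-trip time.
c = 343;              % speed of sound in air [m/s] at about 20 degrees C
t = 0.0135;           % example round-trip echo time [s] (hypothetical)
d_cm = 100 * c * t / 2    % one-way distance: about 232 cm, near the
                          % 233 cm upper limit of the sensor
```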

Figure 2.4: Light sensor

- Light sensor - detects light levels by sensing reflected light from a built-in red LED, or ambient light. It can also read the light intensity of coloured surfaces.


Figure 2.5: Servo Motor

- The servo motors have built-in rotary encoders that sense the rotation of the motor with an accuracy of one degree.

2.2 MATLAB
MATLAB[10] is a high-level programming language for numerical computing, data acquisition and analysis. It can be used to control LEGO NXT robots over a Bluetooth serial port (serial port communication is part of the base functionality of MATLAB) or via a USB connection. There are several papers in [14] that take on the topic of LEGO NXT programming in MATLAB.

2.3 RWTH - Mindstorms NXT Toolbox for MATLAB


RWTH toolbox [14] is developed to control Lego Mindstorms NXT robots with MATLAB via a USB connection or over a Bluetooth serial port. RWTH toolbox is free open-source software. The toolbox functions are based on the Lego

Mindstorms NXT Bluetooth communication protocol to control the intelligent NXT Brick via a wireless Bluetooth connection or via USB.
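A minimal session with the RWTH toolbox might look like the sketch below. The function names follow the toolbox documentation; the sensor port and .ini file name are assumptions matching the setup described later in this report:

```matlab
% Sketch of a typical RWTH toolbox session (not the project's actual code).
COM_CloseNXT('all');                  % reset any old connections
h = COM_OpenNXT('bluetooth.ini');     % open the Bluetooth (or USB) link
COM_SetDefaultNXT(h);                 % make it the default NXT handle
OpenUltrasonic(SENSOR_4);             % start the ultrasonic sensor
dist = GetUltrasonic(SENSOR_4)        % one distance reading, in cm
CloseSensor(SENSOR_4);
COM_CloseNXT(h);
```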

2.4 Petri Net and DEDS


Petri Net (Place/transition net) is a mathematical modelling language for describing distributed systems. Petri Net offers a graphical notation for stepwise processes that include choice, iteration and concurrent execution. Petri nets are a popular way of modelling concurrency and synchronisation in distributed systems. Petri Net is a discrete event dynamic system[5] (DEDS), which are


dynamic systems that are not synchronized (asynchronous, not driven by a clock), but move forward with the occurrence of events.
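In the standard formal notation, a marked Petri Net can be written as a tuple:

```latex
PN = (P, T, A, W, M_0)
```

where P is the finite set of places, T the finite set of transitions (with P and T disjoint), A ⊆ (P × T) ∪ (T × P) the set of arcs, W the arc weights, and M_0 the initial marking, i.e. the initial distribution of tokens over the places. These are essentially the elements declared in the Petri net definition files used in this project.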

2.5 GpenSIM
The General Purpose Petri Net Simulator (GpenSIM)[1] is a Petri Net simulator which satisfies the three criteria of being flexible, extensible and easy to use. GpenSIM is a toolbox for the MATLAB platform. Diverse toolboxes like the Fuzzy toolbox and the Control System toolbox can be used in models developed with GpenSIM. There are other tools for discrete event simulation, like Automata, Stateflow and stand-alone Petri Net tools, but unlike GpenSIM these are stand-alone systems, and integrating them with other types of tools, like control systems, is not possible.
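As a rough sketch of what a GpenSIM definition file looks like (a hypothetical miniature, not the project's actual NXT_Clawbot_def.m; the exact API differs between GpenSIM versions):

```matlab
% Hypothetical two-place loop: scan, then drive, then scan again.
function [PN_name, set_of_places, set_of_transitions, set_of_arcs] = tiny_def()
PN_name = 'Tiny scan loop';
set_of_places      = {'p1', 'p2'};
set_of_transitions = {'tScan', 'tDrive'};
% arcs as triplets: from, to, weight
set_of_arcs = {'p1','tScan',1,  'tScan','p2',1, ...
               'p2','tDrive',1, 'tDrive','p1',1};
```

The pre- and post-processor files (such as the tScan1_pre/tScan1_post files shown in chapter 3) then attach the enabling conditions and the robot actions to each transition.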

2.6 Autonomy and Artificial Intelligence


Autonomous robots are machines which can do tasks in environments that have no set structure, with limited or no human interaction. There are different levels of autonomy[2], from the system offering no assistance at all, leaving all decisions and actions to the user, to the computer deciding everything and acting autonomously, completely ignoring the user. A high level of autonomy is usually required in difficult environments like space, the deep sea, sewage systems etc. At a high level of autonomy the robot can get information about the environment, work for an extended period without human interaction, move itself or parts of itself without being controlled by a human, and avoid hurting itself, humans or the environment (unless this is what the robot is specifically designed for). Why use autonomy? It reduces the workload for the user by letting the system make the mundane decisions while the user makes the important ones. This helps increase effectiveness and lower cost.

Chapter 3
Method and design
We used a simple robot with two individually rotating wheels, using one motor for each wheel, with a balance wheel behind. The robot has 4 sensors: a push sensor, a light sensor, an ultrasonic sensor and the rotational sensors in the servo motors. The push sensor is placed on top and pressing it starts the program; it was added so that the robot would not start if we accidentally started the program from the computer. The light sensor faces down towards the floor and detects when the robot is about to drive outside the marked search area. The ultrasonic sensor is placed at what is considered the front; it acts as the eyes of the robot and is what is used to find objects. The light sensor does not work well for finding objects, because that would require special lighting conditions and the search objects would have to be different. The rotational sensors are used to calculate positions and angles of the robot.

We are going to design the robot to have a flexible autonomous behaviour, meaning that the robot has a main goal, which is finding objects in the search area, but we want it to know what to do if it does not find an object, or finds an object outside the search area. We will do this by planning and optimizing action sequences to achieve our goals. We will modify our plans as we go further into the programming, and improve by learning from experience: debugging and analyzing past runs to detect potential for improvement and implementing the improvements in the system. While testing and analyzing we found that balls were the easiest search objects for the ultrasonic sensor to read when scanning. Another method that can be used is making the


robot move in a specic pattern or follow a path like in [11]. We chose not to use this method since we wanted a more autonomous behaviour.

3.1 Implementation and coding


3.1.1 Bluetooth
To communicate between MATLAB and the NXT we used a serial port profile, which works like a virtual serial port. Through serial port commands we can send and receive data in MATLAB. To connect over Bluetooth we used a .ini file containing the settings for MATLAB, placed in the current directory.

bluetooth.ini file:

    [Bluetooth]
    SerialPort=COM23
    BaudRate=9600
    DataBits=8
    SendSendPause=5
    SendReceivePause=25
    Timeout=2

The variable 'SerialPort' specifies where the Bluetooth adapter is connected; this is the most important part of this file. We used Windows XP, where it is a COM port, in our case COM port 23. For further explanation see Appendix C.

3.1.2 Main MATLAB code

The main program is in these files, found in Appendix D.2; for more information on how to run the program, see Appendix B:

CONNECT.m

This file is the initiation file; it connects the NXT to the PC using Bluetooth and reads the bluetooth.ini file. Running this .m-file will reset all connections and sensors and connect the NXT to the Bluetooth unit.

NXT_Clawbot_def.m

This is the definition file of the Petri Net. It includes the arcs, places and transitions.


NXT_Clawbot.m

It includes the initial token definitions and initial firing definitions.

initNXT.m

Initiation file for the NXT. It sets the different global variables needed to run the robot and maps the ports to motors and sensors. See Appendix D for more information and all the MATLAB code.

3.1.3 Modules

There are several ways to detect objects. In [12] a camera is used to detect faces in images; a similar technique could be used to detect the balls if the robot had been fitted with a camera. We decided to go with a simpler approach, using a sonar sensor as mentioned earlier. Another method worth mentioning is simultaneous localization and mapping (SLAM), which is used in [6]. This method would be too complicated and time consuming for us to implement.

The scan sequence is the sequence which scans an area in front of the robot for objects. The routine starts when the start button is pressed. The robot will turn 45 degrees to the right and then start searching while turning to the left. The NXT is limited in that it cannot read from sensors and send signals to its motors at the same time, so the robot turns to the left in small increments, reads from the ultrasonic sensor (see figure 2.3), sends the reading to the computer, and continues until it has turned 90 degrees. For every rotational step, the robot saves the distance returned by the ultrasonic sensor and the angle it has rotated. The distance to the closest object is calculated and the NXT positions itself towards the object by rotating the calculated angle. The angle is calculated by taking all the saved distance points, finding the one closest to the NXT, taking all points which vary at most +/- 5 cm from the closest one, and picking the middle one. Then the NXT will drive forward the calculated distance, grab the object and drive until it finds the edge of the search area (we created a search area by making a circle with A4 paper). When the border of the search area is detected, the NXT stops, releases the object, reverses a bit, turns 180 degrees, drives forward and starts the searching loop again.

If all scanned points from the ultrasonic sensor are over 100 cm away, all points will be set to zero. This is done so that the robot does not detect too many


objects which are too far away. If the scan area size is increased, this threshold should be increased as well. In the NXT_Clawbot_def.m file we could not divide the code into modules; it would stop when trying to change modules. All the .m-files used can be found in Appendix D. The searching routine can result in four different scenarios, which we have divided into modules:

Module 1:
Object detected normally: In this module, the robot detects an object in the search area at the initial scan, grabs the object and drops it off at the edge of the search area. After the robot has removed 5 objects, the program stops.

Figure 3.1: Module 1

Module 1 event sequence:


1. Push button is pressed to initiate the program
2. The NXT starts to scan
3. Finds an object
4. Positions itself towards the object
5. Drives to the object
6. Grabs the object
7. Drives until the edge is detected
8. Stops when the edge is detected
9. Drops the object


10. Reverses
11. Turns 180 degrees
12. Drives forward
13. Back to 2

Here we set it to run through the loop 5 times. This can be seen at the arc between P18 and tDone in figure 3.1. P19 works as a sink.
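The 5-object loop counter is expressed through an arc weight: tDone needs five tokens in P18 before it can fire, and P19 absorbs the result. In GpenSIM's arc notation this would look roughly like the following (a sketch, not the literal line from NXT_Clawbot_def.m):

```matlab
% tDone consumes 5 tokens from p18 at once (arc weight 5); p19 is a sink.
set_of_arcs = {'p18','tDone',5,  'tDone','p19',1};
```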

Figure 3.2: Sensor read when ball is in center

The points located between 15 and 25 in figure 3.2 are an object close to the robot; the point in the center of those points is marked red and is the center of the object found. The robot will use this point to calculate which angle the servo motors have to rotate in order to position itself in front of the object and drive towards it.
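The selection of this centre point, as described in section 3.1 (closest reading, then all points within +/- 5 cm of it, then the middle one), can be sketched roughly as follows. Variable names and sample readings are ours, not those used in ACTION_ScanDistance.m:

```matlab
% distance(i): ultrasonic reading [cm] at scan step i (0 = nothing seen),
% angle(i): rotation [deg] at that step. All values are invented.
distance = [0 0 80 23 21 20 22 24 90 0];
angle    = linspace(-45, 45, numel(distance));

valid = distance > 0;                        % ignore empty readings
dmin  = min(distance(valid));                % closest reading
near  = valid & abs(distance - dmin) <= 5;   % points within +/- 5 cm of it
idx   = find(near);
mid   = idx(ceil(numel(idx)/2));             % middle point of the cluster
targetAngle = angle(mid)                     % angle to rotate towards
targetDist  = distance(mid)                  % distance to drive
```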


Figure 3.3: Ball in center

Figure 3.3 is an illustration of the scan where the search object is found in the middle of the scan.

What happens in the transitions in Module 1:

tScan1_post.m is shown here:

function [] = tScan1_post(transition)
global global_info;
ACTION_scanDistance;
b2 = 1;
end

As shown in the code, the tScan1_post.m file calls ACTION_ScanDistance.m, where at the end of the code the edge detector is as seen here:

% check if target is at edge of scan-area.
if distance(1) > 0 || distance(2) > 0 || distance(37) > 0 || distance(38) > 0


    global_info.edgeDetected = 1
else
    global_info.edgeDetected = 0
end

Here tDToTarg_pre checks that the edgeDetected flag set in the code above is 0 and that there is a token at p5, as seen here:

function [fire, transition] = tDToTarg_pre(transition)
global global_info;
b4 = get_place('p5');
if global_info.edgeDetected == 0 && b4.tokens == 1
    fire = 1;
end

tDToTarg_post calls ACTION_DriveToTarget.m, as seen here:

function [] = tDToTarg_post(transition)
ACTION_DriveToTarget;
end

tClose_pre checks if falseTarget is 0, as seen here:

function [fire, transition] = tClose_pre(transition)
b5 = get_place('p6');
global global_info;
if (b5.tokens && global_info.falseTarget == 0)
    fire = 1;
else
    fire = 0;
end


If tClose fired, then tClose_post will call ACTION_CloseClaw.m, as shown here:

function [] = tClose_post(transition)
pause(5);
disp('Target aquired!! These ARE the droids we are looking for');
ACTION_CloseClaw;
end

After the robot has picked up the search object, tRot180 fires and tRot180_post calls ACTION_Rotate180.m, as shown here, which makes the robot rotate 180 degrees:

function [] = tRot180_post(transition)
ACTION_Rotate180;
end

tBorder fires and in tBorder_post ACTION_FindBorder.m is called; it uses the light sensor to find the border and makes the robot stop before driving outside the search area:

function [] = tBorder_post(transition)
ACTION_FindBorder;
end

After the border is found, tOpen will fire and in tOpen_post ACTION_OpenClaw.m is called, as seen here, which opens the claw and releases the search object:


function [] = tOpen_post(transition)
ACTION_OpenClaw;
end

Then tReverse fires and calls ACTION_Reverse.m, as seen here, and the robot reverses:

function [] = tReverse_post(transition)
ACTION_Reverse;
end

Then the robot rotates when tRot1802 calls ACTION_Rotate180.m, as seen here:

function [] = tRot1802_post(transition)
ACTION_Rotate180;
end

Then it drives forward a bit when tForward calls ACTION_Forward.m on firing, as seen here:

function [] = tForward_post(transition)
ACTION_Forward;
end


tBuffer is just a transition we added because, for some unknown reason, the net would not run without it:

function [fire, transition] = tBuffer_pre(transition)
b12 = get_place('p13');
fire = b12.tokens;

tDone works as a sink and will fire when p18 has 5 tokens, which is when 5 objects have been found and placed outside the search area. A typical run is shown in section 4.1, Example run.

Module 2:
The robot detects an object, but it was detected at one of the edges of the sensor's range during the scan sequence. The robot will therefore position itself towards where it thinks the ball is, scan again, position itself towards the ball according to the new scan, then move into Module 1. This module was created to make sure that the robot correctly finds objects. While testing, we found that without this module the robot often missed the search object and sometimes bumped it out of the search area.

Figure 3.4: Petri Net graph of Module 2


Module 2 event sequence:


1. Object detected at the edge of the sensor range in the scan sequence
2. Redirects itself towards the detected object
3. Scans again
4. Positions itself towards the detected object
5. Drives to the object
6. Continues like in Module 1

Note: At point 3, if the sensor does not detect anything the program will stop, but while testing and simulating this was not a problem. The reason for this module is to recalculate the rotation angle, in order for the robot to drive more accurately to the object it has already found.

Figure 3.5: Sensor read when ball is on the edge

Object detected on the edge of the sensor: As seen in figure 3.5, the object was found at the beginning of the scan. When the robot completes the sequence it will believe that the center of the search object is located where the red point is marked. Since the robot might not have scanned the entire search object, this might be inaccurate. It will therefore rotate towards the marked point, then do another 90 degree scan. This is to ensure that the robot scans the entire search object and gets a more accurate reading. The same will happen if the sensor scans the closest object at the end of the 90 degree scan.


Figure 3.6: Ball on edge

Figure 3.6 is an illustration of when the robot scans and does not cover the whole search object, because it is on the edge of the scan. The same will happen if the search object is scanned at the other edge.

What happens in the transitions in Module 2:


At the end of ACTION_ScanDistance.m it will detect whether the object was found at the beginning or the end of the scan, as shown here:

% check if target is at edge of scan-area.
if distance(1) > 0 || distance(2) > 0 || distance(37) > 0 || distance(38) > 0
    global_info.edgeDetected = 1
else
    global_info.edgeDetected = 0
end

When there is a token at p5, tEdge is enabled. tEdge_pre.m will check that edgeDetected is 1 and will fire, as shown here:


function [fire, transition] = tEdge_pre(transition)
global global_info;
b5 = get_place('p5');
if b5.tokens && global_info.edgeDetected == 1
    fire = 1;
else
    fire = 0;
end

tScan3 will fire and tScan3_post calls ACTION_ScanDistance.m, as seen here, which makes the robot scan again to recalculate the position of the search object:

function [] = tScan3_post(transition)
ACTION_ScanDistance;
end

When tToTarg fires, tToTarg_post.m calls ACTION_RotateToTarget.m, as seen here, making the robot turn towards the search object:

function [] = tToTarg_post(transition)
ACTION_RotateToTarget;
end

The robot will now continue like in Module 1.


Module 3:
No object detected: All scanned points are over 100 cm away; this means that there are no objects in the current scan, and the robot will continue to search towards the left. If the robot has searched an entire 360 degree rotation but still has not found any objects, it will move a certain distance forward and then start scanning again.

Figure 3.7: Petri Net Graph of Module 3

Module 3 event sequence:


1. Scans
2. Nothing detected
3. Continues the scan another 90 degrees to the left
4. Repeats 3 until it has scanned 360 degrees
5. If nothing is found after scanning 360 degrees: drive forward; if something is found: continue like in Module 1
6. Back to 1

In figure 3.7, after tScan1 fires, P3 will receive a token. If an object is found, tTarget will fire and the token from P3 will be placed in P4; if nothing is detected, tNoTarg will fire and a token will be placed in P14. A new scan will start (the robot just continues to scan to the left) and a token is placed in P15. If an object is found, tTarget2 will fire and the token from P15 will be placed in


P4, but if nothing is detected tNoTarget2 will fire and a token will be placed in P14 and P16. It has now scanned 180 degrees. This loops until P16 has three tokens and t360Scanned fires, or the sensor finds something, tTarget2 fires, and it continues like in Module 1. If t360Scanned fires, three tokens are taken from P16 and one is placed in P17. After that, tForward2 fires, making the robot move forward a bit; a token is placed in P2 and the routine continues. tCleanUp1 and tCleanUp2 work as sinks; they take the excess tokens from P16 if tTarget2 fires.

Figure 3.8: Sensor read when nothing is detected

Figure 3.8 shows what the sensor read out looks like when no objects are found.


Figure 3.9: Ball is out of sight

Figure 3.9 is an illustration of the search object being outside the scan range of the robot.

What happens in the transitions in Module 3:


At the end of ACTION_ScanDistance.m, which is called in tScan1, it will check if there is an object nearby, as seen here:

% Confirm target detection and plot destination
finalTarget = tachoArea(int8(length(tachoArea)/2));
if distance > 100
    global_info.targetAcquired = 0;
elseif distance < 100
    global_info.targetAcquired = 1;
    plot(finalTarget/5, distance(finalTarget/5), '-*r');
end

When a token is in p3, tTarget and tNoTarg are enabled; if targetAcquired


is 0, then tNoTarget will fire, as shown here:

function [fire, transition] = tNoTarget_pre(transition)
global global_info;
% global targetAcquired;
b2 = get_place('p3');
if ((b2.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end

When tScan2 fires it calls ACTION_ScanDistance2.m in tScan2_post.m, as shown here; this makes the robot scan another 90 degrees to the left:

function [] = tScan2_post(transition)
ACTION_ScanDistance2;
end

ACTION_ScanDistance2.m works like ACTION_ScanDistance.m and also sets targetAcquired as shown above. After tScan2 has fired, a token is placed in p15, which enables tTarget2 and tNoTarget2. If targetAcquired is 1, tTarget2 will fire and it will continue in Module 1; however, if targetAcquired is 0, tNoTarget2 will fire, putting a token in p16 and one in p14, as seen here:

function [fire, transition] = tNoTarget2_pre(transition)
global global_info;


% global targetAcquired;
b14 = get_place('p15');
if ((b14.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end

When a token is placed in p14, tScan2 is enabled again, making the robot scan again. If a target is acquired, tTarget2 fires and it continues in Module 1; tCleanUp1 also fires, sinking the excess token. If no target is acquired, tNoTarget2 fires again, putting another token in p16. This continues until a target is acquired, or until t360Scanned is enabled when 3 tokens are at p16 and one token is at p14, as shown here:

function [fire, transition] = t360Scanned_pre(transition)
display('WE ARE IN T360SCANNED_POST');
b15 = get_place('p16');
b13 = get_place('p14');
fire = (b15.tokens == 3) && b13.tokens;

tForward2 will now be enabled, and when firing it calls ACTION_Forward.m, as seen here:

function [] = tForward_post(transition)
ACTION_Forward;
end

After driving forward it will continue like in Module 1:


Module 4:
Object detected outside the search area (false target): While scanning, the robot might detect an object it has already removed from the search area. When this happens, the robot will drive towards the object like in Module 1, but when the search area border is detected with the light sensor, the robot will stop, rotate 180 degrees and search again.

Figure 3.10: Petri Net of Module 4

Module 4 event sequence:


1. The NXT starts to scan
2. Finds an object
3. Positions itself towards the object
4. Drives to the object
5. Border is detected
6. Stops
7. Turns 180 degrees
8. Drives forward
9. The NXT starts to scan again


Note: While testing, we experienced that the robot got stuck when we made the search area in various shapes. It might get stuck in an infinite loop if two search objects are on opposite sides of a corner and the robot is searching between them. It will end up scanning, finding an object, being stopped by the border, turning around, scanning again, finding the other object, and continuing like this until the battery dies. To make sure this does not happen, the search area should be as circular as possible.

Figure 3.11: Object detected outside search area

Figure 3.11 is an illustration of the NXT robot finding an object outside the search area. The border of the search area is a black line, which the robot detects while driving towards the search object using the light sensor, which points to the floor.

What happens in the transitions in Module 4:

At the end of ACTION_DriveToTarget.m, which is called in tDToTarget, it checks the light sensor to see if it finds the border of the search area, as seen here:

OpenLight(SENSOR_2, 'ACTIVE');


light = GetLight(SENSOR_2);

if light > 350
    global_info.falseTarget = 1;
end
CloseSensor(SENSOR_2);
end
end
disp('falseTarget');
global_info.falseTarget

When a token is placed in p6, tClose and tFalseTarget are enabled. If falseTarget is 0, tClose will fire and continue like in Module 1; however, if falseTarget is 1, then tFalseTarget will fire, as shown here:

function [fire, transition] = tFalseTarg_pre(transition)
global global_info;
b5 = get_place('p6');
if (b5.tokens && global_info.falseTarget == 1)
    fire = 1;
else
    fire = 0;
end

Then tRot1803 fires and calls ACTION_Rotate180.m in tRot1803_post.m, as shown here, making the robot rotate 180 degrees:

function [] = tRot1803_post(transition)
ACTION_Rotate180;
end


Then tForward2 fires and calls ACTION_Forward.m in its post-function, as shown here, making the robot drive forward a bit:

function [] = tForward_post(transition)
ACTION_Forward;
end

After driving forward it will continue like in Module 1:


Complete search routine


When all 4 modules are combined, we get the complete Petri Net:

Figure 3.12: Complete Petri Net graph

Chapter 4
Testing, analyzing and result
In our testing we tried different search objects, such as cans, balls and small square pieces built with Lego. We found that the balls worked best for the ultrasonic sensor. In this chapter we will look at the different problems and bugs we found while testing and programming.

4.1 Example run


To give the reader a good overview of the system's functionality, a step-by-step example run is presented below.


Figure 4.1: Here we assume that the Bluetooth connection has been established and that the software has been started according to the user manual. The user will here push the start button and the routine will begin. The first action the robot will do is a 90 degree scan, illustrated by the blue field in front of the robot.


Figure 4.2: The robot will here detect the ball and position itself towards it.


Figure 4.3: The robot will then drive towards the ball, stopping directly in front of it. It will then close its claw, securing the ball.


Figure 4.4: After the robot has secured the ball it will turn 180 degrees.


Figure 4.5: Once the robot has repositioned itself it will drive until it reaches the border.


Figure 4.6: As the robot reaches the border, the light sensor will detect the change in color and the robot will stop. When the robot has stopped the claw will open to release the ball.


Figure 4.7: Once the ball is released, the robot will drive backwards, reentering the search area.


Figure 4.8: The robot will then rotate 180 degrees so that it is pointing towards the center of the area. After it has rotated, it will drive forward by a fixed length.


Figure 4.9: When the robot has stopped, it will scan again. In this case no balls are detected, and it will continue scanning in 90 degree intervals until it has searched a full 360 degrees.


Figure 4.10: When the robot has scanned 360 degrees around one point without detecting any balls it will drive forward.


Figure 4.11: The robot will then scan a new 90 degree area. Again the robot will fail to detect any balls and will continue scanning another 90 degrees.


Figure 4.12: On the next scan it will detect a ball; however, the ball is on the edge of the scanned area.


Figure 4.13: The robot will therefore position itself towards where it thinks the ball is and scan again. This is to ensure that it is positioned correctly towards the ball.


Figure 4.14: As earlier the robot will now position itself towards the ball, and drive towards it.


Figure 4.15: Once it has reached the ball, it will grab it and rotate 180 degrees.


Figure 4.16: The robot will then drive forward until it reaches the border, let go of the ball, and reverse as shown earlier.


Figure 4.17: It will again rotate 180 degrees so that it is pointing towards the center of the search area.


Figure 4.18: Again it will start to scan. This time, the robot will have to scan 360 degrees until it finds the last ball.


Figure 4.19: Once the ball is detected it will position itself towards it, and drive forward.


Figure 4.20: When the robot stops, it will close the claw to secure the ball.


Figure 4.21: After the ball has been secured, the robot will rotate 180 degrees and drive towards the border. Once the robot has reached the border and released the ball, the program is complete.


Below is a table showing estimated times the robot uses to perform different actions. The times were found during a test where the ball was 20 cm away from the robot during the scan sequence, and the robot had to travel 1 m to reach the border. The times are given as rough estimates; exact times cannot be acquired for all actions due to the random locations of the balls in the search area.

Action performed           Time taken
Normal scan                29 s
Reposition towards ball     5 s
Drive to ball (20 cm)       7 s
Close claw                  4 s
Drive to border (1 m)       6 s
Open claw                   4 s
Drive backwards             2 s
Rotate 180 degrees          5 s
Drive forward               5 s
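As a quick illustration of what these numbers imply (a sketch in Python rather than the project's MATLAB, assuming each action is performed exactly once per ball), the table gives roughly 67 seconds for one full pick-up cycle:

```python
# Estimated per-action times from the table above, in seconds.
action_times = {
    "normal scan": 29,
    "reposition towards ball": 5,
    "drive to ball (20 cm)": 7,
    "close claw": 4,
    "drive to border (1 m)": 6,
    "open claw": 4,
    "drive backwards": 2,
    "rotate 180 degrees": 5,
    "drive forward": 5,
}

# One full pick-up cycle, assuming each action happens exactly once.
cycle_time = sum(action_times.values())
print(cycle_time)  # 67
```

Note that the scan alone accounts for nearly half of the cycle, so any additional scans add far more time than the driving actions do.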

A short video was created showing the search function of the robot; it is available if the reader has access to the complete zip file.

4.2 Limitations and challenges


- Bluetooth
Connecting the NXT to MATLAB using Bluetooth takes a lot of time, and the NXT will only connect to certain types of Bluetooth radios or external Bluetooth dongles.

- NXT limitations
The NXT cannot send and receive signals at the same time, and it cannot handle transferring large amounts of data. One challenge was that the Clean-up robot could detect one of the objects it had already cleaned up and placed outside the search area: the NXT could not continuously check with the light sensor whether it was within the search area while scanning for objects with the ultrasonic sensor. This makes the scanning routine very slow. The NXT also has only 130 kB of storage space. This might not be a big problem, since programs use little storage space, but if custom sounds are desired on the NXT, the storage space will be used up by only a few files.


- Lego bricks
We were given a building set with only enough pieces to build the different robots from that set. Those robots could not perform the tasks we wanted; they had no grabbing mechanism. The robot we created is not completely symmetrical in appearance, since we did not have enough pieces. There were also not enough pieces to build the claw as we wanted; we had hoped to build a claw that would lift up the search object when grabbing it.

- Servo motors
The two individual motors do not rotate at exactly the same speed. One motor always starts a short time before the other, causing the robot to turn slightly when we program it to go straight forward. The robot also sometimes has problems with very small rotations. This may cause it to miss its intended target.

- Sensors
The ultrasonic sensor sometimes has a hard time sensing different objects. For example, in an obstacle avoidance program that uses the ultrasonic sensor for detection, the robot can avoid a square item standing perpendicular to the sensor, but if it finds the object at a different angle it gets inaccurate readings, which may result in the robot bumping into the item.

- Battery and recharging


The NXT is battery driven and needs to be charged from time to time. When the battery is not fully charged, the robot sometimes has a hard time making small turns, or it works more slowly.

4.3 Analyzing
By doing test runs with the robot we have seen that the program itself runs correctly through all stages of the Petri net. However, the inaccuracy of the hardware components on the robot will sometimes lead to failures in the program's functionality. For instance, if the robot detects a ball and is to drive forward to it, it might not drive straight enough to be able to position


itself correctly. This may result in a failure to grab the search object, and even then the NXT will continue as if the object had been removed from the search area. If the program is set to find a specific number of search objects, this results in a search object being left in the search area. Also, when the robot has found an object, grabbed it, and is on its way to put it outside the search area, it will simply drive straight at any objects in its way and hit them. No module was created for the scenario where two search objects are placed right next to each other. If this happened, the program would read the two objects as one big object and drive into the middle of them, picking up both or neither. While testing, the search objects were always placed apart from each other so that this would not happen.

4.4 Result
The clean-up robot was completed and works as it is supposed to, within the limitations and challenges mentioned. The routine is fully autonomous, as we wanted it to be. In ideal conditions (a set search environment, search objects placed apart, and good lighting) the robot can run by itself. With the limited resources and time, we are satisfied with the outcome. We have reached the conclusion that using a model and simulation approach to program a robot at this intended level of autonomy performs too poorly compared to other types of programming. In this report we have included building instructions for the exact model we created (Appendix A, Building guide), a full user guide on how to run the program with the robot model (Appendix B, User manual for Lego NXT Clean-up robot), and an installation guide covering the programs needed to run the program (Appendix C).

Chapter 5
Discussion
Discussion of the limitation and the challenges and possible solutions.

5.1 Connection
Since Bluetooth is the only wireless connection available on the NXT kit, we are bound to use it if we do not want a long USB cable hanging from the robot. Bluetooth makes it difficult to send and receive data at the same time. The NXT brick also has a 30 ms latency when it switches from receiving data to sending data, which means that the expected latency for a sensor reading request is 60 ms. At an industrial level this might be a problem, and even dangerous. In [14] it is recommended to use USB for real-time programming, but then we would need a long USB cable, and the robot might detect the cable as an obstacle. A WiFi connection would be preferable, since it can handle more data and is also easier to connect.
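To make the latency concrete, here is a small sketch (Python for illustration, using only the 30 ms switching figure stated above):

```python
SWITCH_LATENCY_MS = 30  # NXT latency when switching between receiving and sending

# A sensor reading request involves two switches: the NXT receives the
# request, then switches to sending in order to return the reading.
round_trip_ms = 2 * SWITCH_LATENCY_MS
print(round_trip_ms)  # 60

# Upper bound on how often a single sensor can be polled over Bluetooth.
max_polls_per_second = 1000 / round_trip_ms
```

This bound of roughly 16 sensor readings per second is one reason the scanning routine is slow compared to running the same logic on the brick itself.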

5.2 Petri Net


Using Petri Net to program a robot is workable, but not ideal. Many times while working on this project we felt that Petri Net programming limited us too much. Since we wanted to program an autonomous robot, Petri Nets are hard to use because their execution is non-deterministic: when multiple transitions are enabled at the same time, any of the enabled transitions may fire (but do not


have to). Petri nets are better suited for concurrent programming, which this program is not. Creating the program directly in MATLAB would be easier and possibly better.
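The non-determinism is easy to see in a minimal marking-based sketch (Python for illustration, not GpenSIM; the place and transition names echo the tClose/tFalseTarget conflict described in chapter 3):

```python
import random

def enabled(marking, transition):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking[p] >= n for p, n in transition["in"].items())

def fire(marking, transition):
    # Consume input tokens, produce output tokens.
    for p, n in transition["in"].items():
        marking[p] -= n
    for p, n in transition["out"].items():
        marking[p] = marking.get(p, 0) + n

# One token in p6 enables BOTH transitions at once.
marking = {"p6": 1, "p7": 0, "p8": 0}
tClose = {"in": {"p6": 1}, "out": {"p7": 1}}
tFalseTarget = {"in": {"p6": 1}, "out": {"p8": 1}}

candidates = [t for t in (tClose, tFalseTarget) if enabled(marking, t)]
fire(marking, random.choice(candidates))  # either may fire: non-deterministic
```

This is exactly why GpenSIM pre-processor files (such as tFalseTarg_pre) are needed: they impose conditions on firing so that the desired transition wins the conflict.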

5.3 Further work


There are several things that can be added to this project.

Speed

The robot has a slow search routine, since its calculations are slow and the NXT cannot process large amounts of data at the same time. This can be compensated for by adding more robots to the search area, controlled using swarm intelligence [9], where the robots share their current positions so that each robot searches a distinct part of the search area while avoiding collisions with the others. This would greatly decrease the time it takes to clear the area of objects.
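As a minimal sketch of the partitioning idea (illustrative Python; a real system would also use the shared positions mentioned above), each robot could be assigned a distinct angular sector of the area:

```python
def assign_sectors(num_robots):
    # Split the 360-degree area into equal, non-overlapping angular
    # sectors, one per robot, so no region is searched twice.
    width = 360 / num_robots
    return [(i * width, (i + 1) * width) for i in range(num_robots)]

print(assign_sectors(4))  # [(0.0, 90.0), (90.0, 180.0), (180.0, 270.0), (270.0, 360.0)]
```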

Increased accuracy

One of the most important things this robot needs is an increase in accuracy. Mainly two improvements would increase its performance: more accurate measurements from the sonar, so that the robot stops at the correct distance from the ball, and better motor control, so that the robot drives straighter than it currently does and actually reaches its designated target. This can be achieved by scanning for the ball several times as the robot is advancing towards it; this will, however, greatly increase the time the robot uses to clear objects from the search area. Another method would be to add a camera, either on the robot or externally with an overhead view, and use object detection algorithms [12]. Some of these methods are covered in [15], but there the program was not created with GpenSIM, but directly in MATLAB.
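One cheap variant of the repeated-scanning idea is to average several sonar readings instead of trusting a single one (illustrative Python; the 100 cm out-of-range value mirrors the check used in scanDistance):

```python
def filtered_distance(readings, out_of_range=100):
    # Average several ultrasonic readings, discarding out-of-range
    # values, to reduce the effect of a single noisy measurement.
    valid = [r for r in readings if r < out_of_range]
    if not valid:
        return None  # no target detected in any reading
    return sum(valid) / len(valid)

print(filtered_distance([20, 22, 100, 21]))  # 21.0
```

The trade-off is the same as for re-scanning: each extra reading costs one Bluetooth round trip, so accuracy is bought with search time.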

Color detection

The robot could be fitted with a color sensor, so that it can remove only balls of a certain color.


Obstacle detection

For the robot to be fully autonomous, it would need an obstacle detection sensor. All 4 sensor ports were used in this robot, but by adding an additional NXT programmable brick, you will get another 4 sensor ports.

Communication

As mentioned earlier in this report, we had severe problems with the Bluetooth connection. Improving this connection or finding alternative forms of communication would make the robot much more user friendly.

Tracking

It would be ideal if the robot could remember where it has already searched, so that it would not have to search there again; this would increase the speed of the routine. It would also be better if the robot could place all the search objects in one place.

New claw design

When building the robot, a different claw design was desired: one where the claw would lift up the object the robot found. We would also like the claw to be bigger, so that the accuracy requirements for grabbing would be lower.

Chapter 6
Image Reference

Figures 3.2, 3.1.3 and 3.8 are graphs showing the ultrasonic sensor readouts that were sent to MATLAB.

Figures 3.12, 3.1, 3.1.3, 3.7 and 3.10 are drawn using PIPE2, an open source, platform independent tool for creating and analyzing Petri nets, including Generalized Stochastic Petri nets.

Figures 2.3, 2.4, 2.2 and 2.5 were found at http://www.brickset.com/browse/themes/?theme=Mindstorms&subtheme=NXT

For Figures 3.3, 3.6, 3.1.3 and 3.11, the Lego model was created using Lego Digital Designer (LDD), a free computer program produced by the LEGO Group as part of LEGO Design byME, which allows users to build models using virtual LEGO bricks in a computer-aided-design-like manner (found at http://ldd.lego.com/), and edited with Paint.NET, free image and photo editing software for computers that run Windows (found at http://www.getpaint.net/).

Bibliography
[1] Reggie Davidrajuh. GPenSIM: A New Petri Net Simulator. In Petri Nets Applications, Pawel Pawlewski (Ed.), ISBN: 978-953-307-047-6. InTech, 2010.

[2] Edward L. Deci and Richard M. Ryan. The support of autonomy and the control of behavior. Journal of Personality and Social Psychology, 53(6):1024-1037, Dec 1987.

[3] Wikipedia, The Free Encyclopedia. "Lego Mindstorms NXT - Sensors". http://en.wikipedia.org/wiki/Lego_Mindstorms_NXT.

[4] Roar Fjellheim. AI and autonomy in oil and gas, March 2012. NFA Autonomy in oil and gas industry.

[5] Francois Charbonnier, Hassane Alla, and René David. The supervised control of discrete-event dynamic systems. Vol. 7, no. 2, March 1999.

[6] Staffan Ekvall, Danica Kragic, and Patric Jensfelt. Object detection and mapping for service robot tasks. Robotica, Volume 25, Issue 2:175-187, March 2007.

[7] Lego. Lego Digital Designer. http://ldd.lego.com/.

[8] Lego. "NXT". Retrieved April 13, 2012, from http://mindstorms.lego.com/en-us/whatisnxt/default.aspx.

[9] Alcherio Martinoli. Swarm intelligence in autonomous collective robotics: From tools to the analysis and synthesis of distributed control strategies. 1999.

[10] MathWorks. MATLAB - The Language of Technical Computing. http://www.mathworks.se/products/matlab/.

[11] Robert W. Hogg, Arturo L. Rankin, Stergios I. Roumeliotis, Michael C. McHenry, Daniel M. Helmick, Charles E. Bergh, and Larry Matthies. Algorithms and sensors for small robot path following. Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002.

[12] Constantine P. Papageorgiou, Michael Oren, and Tomaso Poggio. A general framework for object detection. Center for Biological and Computational Learning, Artificial Intelligence Laboratory, MIT.

[13] RWTH. Install guide. http://www.mindstorms.rwth-aachen.de/trac/wiki/Download#InstallationGuide.

[14] RWTH Aachen University. "RWTH - Mindstorms NXT Toolbox for MATLAB". Retrieved February 6, 2012, from http://www.mindstorms.rwth-aachen.de/.

[15] Marco Casini, Andrea Garulli, Antonio Giannitrapani, and Antonio Vicino. A MATLAB-based remote lab for multi-robot experiments.

Appendix A
Building guide
See the additional file Building Instructions [claw].html.

Appendix B
User manual for Lego NXT Clean-up robot
B.1 Introduction
This is a small user manual for the Lego NXT Clean-up robot. If you have not read the installation guide, do so now. The Clean-up robot is extremely easy to use once you have been able to get the Bluetooth connection up and running.

B.2 The search area


The first thing you need to do is to create your search area and place out the balls that come with the NXT kit. The search area should have a white border made from, for instance, white A4 papers. It is important that the search area is placed on a dark colored floor and that the border is white, so that the robot is able to distinguish between the search area and the border. We recommend an area with a diameter of roughly 1.5 to 3 meters. Once the search area is ready, place as many balls as you want within its borders, preferably a minimum of 30 cm apart, so that the robot does not confuse two or more balls for one ball. It is also advised that the area is built where there are few objects close to the border, since these might lead to the robot finding objects outside of the search area, resulting in a prolonged search time.



Figure B.1: Search Area


B.3 Adjusting and starting the program


Once the search area is ready, you need to open MATLAB and go to the folder containing the program. If you have recently gone through the installation manual, you should have the robot connected to the computer over Bluetooth; if not, run the connect.m file and make sure that the motorcontroller program is running on the NXT. Open NXT_ClawBot_def.m and change the variable numberOfBalls to the number of balls you have placed in the search area, then place your robot near the border of the search area, facing inwards. Run the file NXT_ClawBot.m; you should hear a beep from the robot to confirm that the program has started. When you are ready, press the push button on the robot and stand back. The robot will now start to remove the balls from the search area, and it will stop when all the balls have been removed.

Appendix C
Installation guide for Lego NXT Clean-up robot
Building instructions for the robot can be found in Appendix 1B and Appendix 2B. There you will find an HTML version and a Lego Digital Designer file; in the latter you can view the build instructions while being able to rotate the model. To be able to use this, you have to download and install Lego Digital Designer, found at [7].

C.1 Installing software


Several pieces of software and several drivers are needed to be able to run the robot from a computer.

C.2 RWTH Mindstorms NXT Toolbox


A link to the download site is given in [13]. Download the newest stable version. This project used version 4.7; however, if a newer stable version has been released, that should work as well. Once the toolbox has been downloaded, find the readme.txt file and follow the instructions carefully. Some things are especially important: checking that the NXT has the correct firmware version, checking that the Bluetooth adapter is compatible, and, when you create the bluetooth.ini file, making sure you have the correct COM port or MAC address. Also make sure that you have updated the driver for the Bluetooth adapter.

C.3 Installing the program


The program for the Clean-Up bot needs to be installed on your computer. To do this, follow these steps.

1. Place the program files in a suitable folder.

2. In MATLAB, go to File -> Set Path and select the folder where you placed the files. Then click Add with sub folder.

C.4 Setting up the bluetooth.ini le


Once you have everything correctly installed, you have to set up the bluetooth.ini file. You can either do this manually by looking at the example files that came with the toolbox, or use the script as explained in the readme file. Remember that on Windows 7 you have to use the MAC address, and on other Windows versions you have to use COM ports.

COM ports.

To find your COM port, you can double click the Bluetooth icon on the right part of your task bar, then click the COM Ports tab. Here you should see two COM ports listed, one for outgoing and one for incoming. The bluetooth.ini file needs the outgoing COM port. If there is only one COM port listed, you have to add the second one: simply press Add and select the NXT.

MAC address

To find the correct MAC address, press Start -> Run and type in cmd. In the window that opens, type ipconfig /all. In the resulting information, look for "Physical address"; there should be 12 hexadecimal digits following this line, and that is your MAC address. Example: Physical address . . . . . . 00-0F-FE-54-37-C1


C.5 Connecting to the NXT


Once you have the bluetooth.ini file, you can move it to the same folder as the program. When you have done that, go to your NXT, navigate to the Bluetooth menu, and select Search. The NXT should now search for your computer's Bluetooth. Once the NXT has found it, select it and confirm the passkey 1234. A Bluetooth notification should now pop up on your computer; enter the same passkey, and hopefully you have been able to connect the NXT to your computer through Bluetooth. If this fails, you can try to go the other way, by searching for Bluetooth units from your computer. To do this, double click the Bluetooth icon on the right of your task bar and select Add. Make sure that the NXT's Bluetooth is turned on, check the box that says "My device is set up and ready to be found", and press Next. Once you find the NXT, double click it; you will now get a similar passkey prompt on the computer and the NXT, so enter 1234 as before. When you have done this, double check the COM ports and MAC address; if they have changed, change them accordingly in the bluetooth.ini file. Once you have done all this, go on the NXT to Bluetooth and then to My Contacts; the computer should appear there now. Run the connect.m file in MATLAB, select the computer on the NXT, press connect on the NXT, and select 1. If everything was successful, you should now have connected the NXT to the computer. One way to verify that you have been able to connect is to check that the motorcontroller program has started on the NXT.

C.6 Troubleshooting the bluetooth connection


Unfortunately the Bluetooth connection is very hard to get stable. If you cannot connect the devices, there are several things you can try:

- Keep trying to connect as explained before, rebooting the NXT once in a while.

- Try pairing the Bluetooth and the computer all over again, remembering to check the MAC address / COM ports.

- Try a different computer.

- It also appears that Mac computers handle Bluetooth much better than Windows; if you have one available, we highly suggest you try it first.

Appendix D
MATLAB code
D.1 Action Commands
The software has a series of files called action commands; these tell the robot what actions it should take. Below is a list of all the files and the actions the robot will perform.

ACTION_CloseClaw.m
This action will rotate the center motor, causing the claw to go from the open position to the closed position, enabling the robot to grab the spheres lying around the search area. It is important that this command is not sent to the robot while the claw is already closed, as this will cause the motor to try to reach a position it cannot go to.

global portC;
global global_info;

mClaw                    = NXTMotor(global_info.portC);
mClaw.SpeedRegulation    = true;
mClaw.Power              = 20;   % positive = close
mClaw.TachoLimit         = 35;
mClaw.ActionAtTachoLimit = 'Brake';
mClaw.SendToNXT();
ACTION_OpenClaw.m
Similar to ACTION_CloseClaw.m; however, the claw will now open.

global portC;
global global_info;

pause(1)
mClaw                    = NXTMotor(global_info.portC);
mClaw.SpeedRegulation    = true;
mClaw.Power              = -20;  % positive = close
mClaw.TachoLimit         = 35;
mClaw.ActionAtTachoLimit = 'Brake';
mClaw.SendToNXT();

ACTION_DriveToTarget.m
The DriveToTarget action drives the robot forward towards the search object so that it is within reach of the claws. The robot stops at given distances and scans the light level of the surface it is currently on; this is to ensure that the robot does not exit the search area. The distance it will drive has been calculated in the scanDistance actions.

%drive to target
global global_info;
global MOTOR_A
global portM
global finalTarget
global distance
global falseTarget

falseTarget    = 0;
drivenDistance = 0;
targetReached  = 0;

while targetReached == 0 && falseTarget == 0;
    if (distance(finalTarget/5)*18) - drivenDistance > 360
        mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
        mTurn1.SpeedRegulation    = false;    % we could use it if we wanted
        mTurn1.Power              = 80;
        mTurn1.TachoLimit         = 360;
        mTurn1.ActionAtTachoLimit = 'Brake';
        mTurn1.SendToNXT();
        pause(2)
        drivenDistance = drivenDistance + 360;

        OpenLight(SENSOR_2, 'ACTIVE');
        light = GetLight(SENSOR_2)
        if light > 350
            targetReached = 1;
            global_info.falseTarget = 1;
        end
        CloseSensor(SENSOR_2);
    elseif (((distance(finalTarget/5)*18) - drivenDistance <= 360) && falseTarget == 0)
        mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
        mTurn1.SpeedRegulation    = false;    % we could use it if we wanted
        mTurn1.Power              = 80;
        mTurn1.TachoLimit         = ((distance(finalTarget/5)*18) - drivenDistance + 5);
        %mTurn1.ActionAtTachoLimit = 'Brake';
        mTurn1.SendToNXT();
        pause(2)
        targetReached = 1;

        OpenLight(SENSOR_2, 'ACTIVE');
        light = GetLight(SENSOR_2);
        if light > 350
            global_info.falseTarget = 1;
        end
        CloseSensor(SENSOR_2);
    end
end
disp('falseTarget');
global_info.falseTarget
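The stop-and-check structure of ACTION_DriveToTarget.m can be summarized in a short sketch (Python for illustration; the factor of 18 tacho degrees per cm is taken from the distance*18 expression in the file, and the +5 degree final leg matches its TachoLimit):

```python
DEGREES_PER_CM = 18   # tacho degrees per cm of travel (from distance*18)
CHUNK = 360           # at most one wheel revolution between light checks

def plan_drive(distance_cm):
    # Break the drive to the target into chunks; after each chunk the
    # robot stops and reads the light sensor to detect the border.
    remaining = distance_cm * DEGREES_PER_CM
    chunks = []
    while remaining > CHUNK:
        chunks.append(CHUNK)
        remaining -= CHUNK
    chunks.append(remaining + 5)  # final leg, +5 degrees as in the code
    return chunks

print(plan_drive(40))  # [360, 365]
```

Checking the light sensor only between chunks is the compromise forced by the NXT's inability to drive and poll the sensor simultaneously over Bluetooth.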


ACTION_FindBorder.m
This action is sent to the robot after it has obtained an item. It causes the robot to drive forward until it reaches the border of the search area, so that it can release the ball, hence removing it from the search area. Again, the robot will stop at given distances and scan the light level so that it can detect when it has reached the border.

%drive forward
OpenLight(SENSOR_2, 'ACTIVE');
global global_info;
global portM

trigger = 1;
while trigger;
    light = GetLight(SENSOR_2);
    if light < 350;
        mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
        mTurn1.SpeedRegulation    = false;    % we could use it if we wanted
        mTurn1.Power              = 80;
        mTurn1.TachoLimit         = 0;
        mTurn1.ActionAtTachoLimit = 'Brake';
        mTurn1.SendToNXT();
    elseif light > 350;
        mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
        mTurn1.SpeedRegulation    = false;    % we could use it if we wanted
        mTurn1.Power              = 80;
        mTurn1.TachoLimit         = 1;
        mTurn1.ActionAtTachoLimit = 'Brake';
        mTurn1.SendToNXT();
        trigger = 0;
    end
end;
CloseSensor(SENSOR_2);

ACTION_Forward.m

When given this command, the robot will simply drive forward a set distance. This is typically used after it has completed a full 360 degree scan without finding any objects.

global global_info;
global portM;
global finalTarget;
global distance;

mTurn1 = NXTMotor(global_info.portM);
mTurn1.SpeedRegulation    = false;
mTurn1.Power              = 80;
mTurn1.TachoLimit         = 300*3;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
% mTurn1.WaitFor();
pause(2)


ACTION_Reverse.m

This action is similar to ACTION_Forward, but the robot will go backwards instead of forward. This is used after the robot has removed a ball from the search area, so that the ball is clear of the claw.

global global_info;
global portM
global finalTarget
global distance

mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false;    % we could use it if we wanted
mTurn1.Power              = -80;
mTurn1.TachoLimit         = 600;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();

ACTION_Rotate180.m

When the ACTION_Rotate180 command is sent to the robot it will rotate 180 degrees. This is typically used after the robot has removed a ball from the

search area and backed away from the ball. This will position the robot so that it is facing towards the center of the search area.

global global_info

% Turn 90 degrees to right
pause(1)
mTurn1 = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false;       % we could use it if we wanted
mTurn1.Power              = -40;
mTurn1.TachoLimit         = 376;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();

mTurn1 = NXTMotor(global_info.portM(2)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false;       % we could use it if we wanted
mTurn1.Power              = 40;
mTurn1.TachoLimit         = 376;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
pause(3)


ACTION_RotateToTarget.m
This action is used after one of the ScanDistance actions has found an object. During the ScanDistance actions, the program records how far the robot has rotated when each distance measurement is taken. From this, we know how far the robot has to rotate to be able to face the object.

%turn to destination
global global_info;
global tachoLength;
global finalTarget;
global distance;

pause(1)
mTurn1 = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false;       % we could use it if we wanted
mTurn1.Power              = -30;
mTurn1.TachoLimit         = (max(tachoLength)-finalTarget)+10;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();

mTurn1 = NXTMotor(global_info.portM(2)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false;       % we could use it if we wanted
mTurn1.Power              = 30;
mTurn1.TachoLimit         = (max(tachoLength)-finalTarget)+10;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
pause(3)

%check distance to target.
OpenUltrasonic(global_info.portUS, 'snapshot');
distanceCheck = GetUltrasonic(global_info.portUS);
CloseSensor(global_info.portUS);
%drive forward

ACTION_ScanDistance.m
When this action is sent to the robot, it will first rotate 45 degrees to the right, then rotate 90 degrees to the left. While rotating to the left, the robot will stop at certain intervals, scan in front of it, and save the distance returned from the scan together with how far it has rotated at that point. When it has finished rotating, it will check whether it has detected any objects.

function [tachoLength,finalTarget] = scanDistance() % portUS = SENSOR_4; % PortM = [MOTOR_A,MOTOR_B]; % PortC = [MOTOR_C]; % portA = [SENSOR_1]; global global global global global global global global_info h; tachoLength; finalTarget; distance; targetAcquired; edgeDetected;

COM_SetDefaultNXT(h); clear figure(1) clf figure(1) hold on grid on

global_info.tachoLength =

zeros(500,1);

APPENDIX D.

MATLAB CODE

85

global_info.distance = zeros(500,1);
n = 8; % bytes the US sensor received
count = 38; % how many readings until end?
plotcols = 8; % how many out of n echos to plot?
outOfRange = 160; % setting for out of range readings
targetAcquired = 0;
data = zeros(1, n);
allX = (1:count+1)';
edgeDetected = 0;

% Turn 45 degrees to right
pause(1)
mTurn1 = NXTMotor(global_info.portM(1)); % ports swapped because it's n
mTurn1.SpeedRegulation = false; % we could use it if we wanted
mTurn1.Power = -40;
mTurn1.TachoLimit = 94;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();

mTurn1 = NXTMotor(global_info.portM(2)); % ports swapped because it's n
mTurn1.SpeedRegulation = false; % we could use it if we wanted
mTurn1.Power = 40;
mTurn1.TachoLimit = 94;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
pause(3)

%start sweep and search
for i = 1 : count
    mTurn1 = NXTMotor(global_info.portM(1)); % ports swapped because it
    mTurn1.SpeedRegulation = false; % we could use it if we wanted
    mTurn1.Power = 20;
    mTurn1.TachoLimit = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();


    mTurn1 = NXTMotor(global_info.portM(2)); % ports swapped because it
    mTurn1.SpeedRegulation = false; % we could use it if we wanted
    mTurn1.Power = -20;
    mTurn1.TachoLimit = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();
    pause(0.8)

    if i == 1
        tachoLength(i) = 5;
    else
        tachoLength(i) = tachoLength(i-1) + 5;
    end

    OpenUltrasonic(global_info.portUS, 'snapshot');
    distance(i) = GetUltrasonic(global_info.portUS);
    CloseSensor(global_info.portUS)
    plot(i,distance(i),'-*')
end%for

%determine closest target
global target
target = min(distance(1:i));
%remove other measurements (above 100cm)
for j=1:i
    if distance(j) > target + 5 || distance(j) < target - 5
        distance(j) = 0;
    elseif distance(j) == 100
        distance(j) = 0;
    end


end

v = 1;
targetArea = zeros(1);
tachoArea = zeros(1);
%find area of block
for u=1:i
    if distance(u) > 0
        temp = zeros(length(targetArea),1);
        temp2 = temp;
        temp = targetArea;
        temp2 = tachoArea;
        targetArea = zeros(length(temp)+1);
        targetArea = temp;
        tachoArea = zeros(length(temp2)+1);
        tachoArea = temp2;
        clear temp;
        clear temp2;
        targetArea(v) = distance(u);
        tachoArea(v) = tachoLength(u);
        v = v + 1;
    end
end

%Confirm target detection and plot destination
finalTarget = tachoArea(int8(length(tachoArea)/2))
if distance > 100
    global_info.targetAcquired = 0
elseif distance < 100


    global_info.targetAcquired = 1
    plot(finalTarget/5,distance(finalTarget/5),'-*r');
end

% check if target is at edge of scan-area.
if distance(1) > 0 || distance(2) > 0 || distance(37) > 0 || distance(38) > 0
    global_info.edgeDetected = 1
else
    global_info.edgeDetected = 0
end
ACTION_ScanDistance2.m
This action is used if ACTION_ScanDistance.m does not detect any objects. The robot then rotates another 90 degrees while scanning, saving the distances and how far it has rotated. This action continues to run until the robot detects an object, or until it has rotated a full 360 degrees.
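In effect this is a retry loop: repeat 90-degree sweeps until a target is seen or a full circle has been covered. A minimal sketch (Python, with a hypothetical `scan_90` standing in for one sweep-and-detect pass; the MATLAB listing below is the actual implementation):

```python
def search_full_circle(scan_90, max_sweeps=4):
    """Repeat 90-degree sweeps until a target is found or 360 degrees scanned."""
    for _ in range(max_sweeps):
        target = scan_90()        # one sweep; None means nothing was seen
        if target is not None:
            return target         # target acquired: stop scanning
    return None                   # full circle covered, no target
```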

function [tachoLength,finalTarget] = scanDistance()
% portUS = SENSOR_4;
% PortM = [MOTOR_A,MOTOR_B];
% PortC = [MOTOR_C];
% portA = [SENSOR_1];
global global_info
global h;
global tachoLength;
global finalTarget;
global distance;
global targetAcquired;

COM_SetDefaultNXT(h);
clear
figure(1)
clf
figure(1)
hold on


grid on

global_info.tachoLength = zeros(500,1);
global_info.distance = zeros(500,1);
n = 8; % bytes the US sensor received
count = 38; % how many readings until end?
plotcols = 8; % how many out of n echos to plot?
outOfRange = 160; % setting for out of range readings
targetAcquired = 0;
data = zeros(1, n);
allX = (1:count+1)';

%start sweep and search
for i = 1 : count
    mTurn1 = NXTMotor(global_info.portM(1)); % ports swapped because it's n
    mTurn1.SpeedRegulation = false; % we could use it if we wanted
    mTurn1.Power = 20;
    mTurn1.TachoLimit = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();

    mTurn1 = NXTMotor(global_info.portM(2)); % ports swapped because it's n
    mTurn1.SpeedRegulation = false; % we could use it if we wanted
    mTurn1.Power = -20;
    mTurn1.TachoLimit = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();
    pause(0.8)

    if i == 1
        tachoLength(i) = 5;
    else
        tachoLength(i) = tachoLength(i-1) + 5;
    end


    OpenUltrasonic(global_info.portUS, 'snapshot');
    distance(i) = GetUltrasonic(global_info.portUS);
    CloseSensor(global_info.portUS)
    plot(i,distance(i),'-*')
end%for

%determine closest target
global target
target = min(distance(1:i));
%remove other measurements (above 100cm)
for j=1:i
    if distance(j) > target + 5 || distance(j) < target - 5
        distance(j) = 0;
    elseif distance(j) == 100
        distance(j) = 0;
    end
end

v = 1;
targetArea = zeros(1);
tachoArea = zeros(1);
%find area of block
for u=1:i
    if distance(u) > 0
        temp = zeros(length(targetArea),1);
        temp2 = temp;
        temp = targetArea;
        temp2 = tachoArea;


        targetArea = zeros(length(temp)+1);
        targetArea = temp;
        tachoArea = zeros(length(temp2)+1);
        tachoArea = temp2;
        clear temp;
        clear temp2;
        targetArea(v) = distance(u);
        tachoArea(v) = tachoLength(u);
        v = v + 1;
    end
end

% Confirm target detection and plot destination
finalTarget = tachoArea(int8(length(tachoArea)/2));
if distance > 100
    global_info.targetAcquired = 0;
elseif distance < 100
    global_info.targetAcquired = 1;
    plot(finalTarget/5,distance(finalTarget/5),'-*r');
end


D.2 Main program

NXT_Clawbot.m

global global_info;
global_info.REAL_TIME = 1; % This is a Real-Time run
global_info.STOP_AT = current_clock(3) + [0 20 0]; % stop after 2 mins
global h;

h = initNXT();
png = petrinetgraph({'NXT_test_def'});
dynamicpart.initial_markings = {'p1',1,'p2',0,'p3',0,'p4',0,...
    'p5',0,'p6',0,'p7',0,'p8',0,'p9',0,'p10',0,'p11',0,'p12',0,...
    'p13',0,'p14',0,'p15',0,'p16',0,'p17',0,'p18',0,'p19',0,...
    'p20',0,'p21',0,'p22',0,'p23',0};
dyn.initial_priority = {'tDone',2,'tScan',1,'tCleanup3',3,...
    'tCleanup2',2,'tCleanup1',1,'tFalseTar',2,...
    'tClose',1,'tEdge',2,'tDToTarg',1}; % initial priority
dynamicpart.firing_times = {'tStart',2,'tScan1',2,'tToTarg',2,...
    'tDToTarg',2,'tClose',1,'tRot180',2,...
    'tBorder',2,'tOpen',2,'tReverse',2,...
    'tRot1802',1,'tBuffer',1,'tTarget',1,...
    'tNoTarget',1,'tScan2',1,'tTarget2',1,...
    'tNoTarget2',1,'t360Scanned',1,'tForward',1,...
    'tForward2',1,...
    'tDone',1,'tFalseTarg',1,'tEdge',1,'tToTarg2',1,'tScan3',1};
display('Push button to start clean-up procedure!');
% dynamicpart.initial_priority = {'tPEDES_START',5, 'tPEDES_CYCLE',5,...
%     'tEM_START',5, 'tEM_CYCLE',5};


%disp('System is ready ...');
sim = gpensim(png, dynamicpart);
% close_TL_NXT(); % Never forget to clean up after your work!!!
% print_statespace(Sim_Results);
% plotp(Sim_Results, {'p1','p2','p3'});

NXT_Clawbot_def.m

function [png] = NXT_test_def()
png.PN_name = 'PDF for nxt test';
png.set_of_Ps = {'p1','p2','p3','p4','p5','p6','p7','p8',...
    'p9','p10','p11','p12','p13','p14','p15','p16','p17',...
    'p18','p19','p20','p21','p22','p23'};
png.set_of_Ts = {'tStart','tScan1','tToTarg','tDToTarg',...
    'tClose','tRot180','tBorder','tOpen','tReverse',...
    'tRot1802','tForward','tBuffer','tTarget','tNoTarget',...
    'tScan2','tTarget2','tNoTarget2','t360Scanned','tForward2',...
    'tDone','tCleanUp1','tCleanUp2','tCleanUp3','tFalseTarg',...
    'tRot1803','tForward3','tEdge','tToTarg2','tScan3'};

png.set_of_As = {'p1','tStart',1,'tStart','p2',1,'p2','tScan1',1,...
    'tScan1','p3',1,'p3','tTarget',1,'tTarget','p4',1,...
    'p4','tToTarg',1,'tToTarg','p5',1,...
    'p5','tDToTarg',1,'tDToTarg','p6',1,...
    'p6','tClose',1,'tClose','p7',1,'p7','tRot180',1,...
    'tRot180','p8',1,'p8','tBorder',1,...
    'tBorder','p9',1,'p9','tOpen',1,'tOpen','p10',1,...
    'p10','tReverse',1,'tReverse','p11',1,...
    'p11','tRot1802',1,'tRot1802','p12',1,'p12','tForward',1,...
    'tForward','p13',1,'p13','tBuffer',1,...
    'tBuffer','p2',1,...
    'p3','tNoTarget',1,'tNoTarget','p14',1,'p14','tScan2',1,...
    'tScan2','p15',1,'p15','tTarget2',1,...
    'tTarget2','p4',1,'p15','tNoTarget2',1,'tNoTarget2','p14',1,...
    'tNoTarget2','p16',1,'p16','t360Scanned',3,...
    'p14','t360Scanned',1,'t360Scanned','p17',1,...
    'p17','tForward2',1,'tForward2','p2',1,...
    'tBuffer','p18',1,'p18','tDone',3,'tDone','p19',1,'p2','tDone',1,...
    'p16','tCleanUp1',1,'p16','tCleanUp2',2,'p16','tCleanUp3',3,...
    'p6','tFalseTarg',1,'tFalseTarg','p20',1,'p20','tRot1803',1,...
    'tRot1803','p21',1,...
    'p21','tForward3',1,'tForward3','p2',1,...
    'p5','tEdge',1,'tEdge','p22',1,'p22','tScan3',1,'tScan3','p23',1,...
    'p23','tToTarg2',1,'tToTarg2','p5',1};


D.3 Transitions (in alphabetical order)

t360Scanned_pre.m

function [fire, transition] = t360Scanned_pre(transition)
display('WE ARE IN T360SCANNED_POST');
b15 = get_place('p16');
b13 = get_place('p14');
fire = (b15.tokens == 3) && b13.tokens;

t360Scanned_post.m

function [] = t360Scanned_post(transition)
end

tBorder_pre.m

function [fire, transition] = tBorder_pre(transition)
b7 = get_place('p8');
fire = b7.tokens;

tBorder_post.m

function [] = tBorder_post(transition)
ACTION_FindBorder;
end

tBuffer_pre.m

function [fire, transition] = tBuffer_pre(transition)
b12 = get_place('p13');
fire = b12.tokens;


tBuffer_post.m

function [] = tBuffer_post(transition)
end

tCleanUp1_pre.m

function [fire, transition] = tCleanUp1_pre(transition)
b3 = get_place('p4');
fire = b3.tokens;

tCleanUp2_pre.m

function [fire, transition] = tCleanUp2_pre(transition)
b3 = get_place('p4');
fire = b3.tokens;

tClose_pre.m

function [fire, transition] = tClose_pre(transition)
b5 = get_place('p6');
global global_info;
if (b5.tokens && global_info.falseTarget == 0)
    fire = 1;
else
    fire = 0;
end

tClose_post.m

function [] = tClose_post(transition)
pause(5);
disp('Target acquired!! These ARE the droids we are looking for');
ACTION_CloseClaw;
end


tDToTarg_pre.m

function [fire, transition] = tDToTarg_pre(transition)
global global_info;
b4 = get_place('p5');
if global_info.edgeDetected == 0 && b4.tokens == 1
    fire = 1;
else
    fire = 0;
end

tDToTarg_post.m

function [] = tDToTarg_post(transition)
ACTION_DriveToTarget;
end

tEdge_pre.m

function [fire, transition] = tEdge_pre(transition)
global global_info;
b5 = get_place('p5');
if b5.tokens && global_info.edgeDetected == 1
    fire = 1;
else
    fire = 0;
end

tEdge_post.m

function [] = tEdge_post(transition)
end


tFalseTarg_pre.m

function [fire, transition] = tFalseTarg_pre(transition)
global global_info;
b5 = get_place('p6');
if (b5.tokens && global_info.falseTarget == 1)
    fire = 1;
else
    fire = 0;
end

tFalseTarg_post.m

function [] = tFalseTarg_post(transition)
disp('False target acquired! Reinitiate search routine');
end

tForward_pre.m

function [fire, transition] = tForward_pre(transition)
global global_info;
b11 = get_place('p12');
fire = b11.tokens;

tForward_post.m

function [] = tForward_post(transition)
ACTION_Forward;
end


tForward2_pre.m

function [fire, transition] = tForward2_pre(transition)
global global_info;
b16 = get_place('p17');
fire = b16.tokens;

tForward2_post.m

function [] = tForward2_post(transition)
ACTION_Forward;
end

tForward3_pre.m

function [fire, transition] = tForward3_pre(transition)
global global_info;
% global targetAcquired;
b20 = get_place('p21');
fire = b20.tokens;

tForward3_post.m

function [] = tForward3_post(transition)
ACTION_Forward;
end

tNoTarget_pre.m

function [fire, transition] = tNoTarget_pre(transition)


global global_info;
% global targetAcquired;
b2 = get_place('p3');
if ((b2.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end

tNoTarget_post.m

function [] = tNoTarget_post(transition)
NXT_PlaySoundFile('r2d2wst2', 0);
end

tNoTarget2_pre.m

function [fire, transition] = tNoTarget2_pre(transition)
global global_info;
% global targetAcquired;
b14 = get_place('p15');
if ((b14.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end


tNoTarget2_post.m

function [] = tNoTarget2_post(transition)
NXT_PlaySoundFile('r2d2wst2', 0);
end

tOpen_pre.m

function [fire, transition] = tOpen_pre(transition)
b8 = get_place('p9');
fire = b8.tokens;

tOpen_post.m

function [] = tOpen_post(transition)
ACTION_OpenClaw;
end

tReverse_pre.m

function [fire, transition] = tReverse_pre(transition)
b9 = get_place('p10');
fire = b9.tokens;

tReverse_post.m

function [] = tReverse_post(transition)
ACTION_Reverse;
end


tRot180_pre.m

function [fire, transition] = tRot180_pre(transition)
b6 = get_place('p7');
fire = b6.tokens;

tRot180_post.m

function [] = tRot180_post(transition)
ACTION_Rotate180;
end



tRot1802_pre.m

function [fire, transition] = tRot1802_pre(transition)
b10 = get_place('p11');
fire = b10.tokens;

tRot1802_post.m

function [] = tRot1802_post(transition)
ACTION_Rotate180;
end

tRot1803_pre.m

function [fire, transition] = tRot1803_pre(transition)
global global_info;
% global targetAcquired;
b19 = get_place('p20');
fire = b19.tokens;


tRot1803_post.m

function [] = tRot1803_post(transition)
ACTION_Rotate180;
end

tScan1_pre.m

function [fire, transition] = tScan1_pre(transition)
b1 = get_place('p2');
fire = b1.tokens;

tScan1_post.m

function [] = tScan1_post(transition)
global global_info;
ACTION_scanDistance;
b2 = 1;
end

tScan2_pre.m

function [fire, transition] = tScan2_pre(transition)
b13 = get_place('p14');
b15 = get_place('p16');
fire = b13.tokens & b15.tokens < 3;

tScan2_post.m

function [] = tScan2_post(transition)
ACTION_ScanDistance2;
end


tScan3_pre.m

function [fire, transition] = tScan3_pre(transition)
b21 = get_place('p22');
fire = b21.tokens;

tScan3_post.m

function [] = tScan3_post(transition)
ACTION_ScanDistance;
end

tStart_pre.m

function [fire, transition] = tStart_pre(transition)
global global_info;
OpenSwitch(global_info.portB);
button = GetSwitch(global_info.portB);
CloseSensor(global_info.portB);
if button == 1
    fire = 1;
else
    fire = 0;
end

tStart_post.m

function [] = tStart_post(transition)
NXT_PlaySoundFile('R2D2', 0);
end


tTarget_pre.m

function [fire, transition] = tTarget_pre(transition)
global global_info;
% global targetAcquired;
b2 = get_place('p3');

if ((b2.tokens & global_info.targetAcquired) == 1)
    fire = 1;
else
    fire = 0;
end

tTarget_post.m

function [] = tTarget_post(transition)
end

tTarget2_pre.m

function [fire, transition] = tTarget2_pre(transition)
global global_info;
% global targetAcquired;
b14 = get_place('p15');

display(global_info.targetAcquired);
if ((b14.tokens & global_info.targetAcquired) == 1)
    fire = 1;
else
    fire = 0;
end


tTarget2_post.m

function [] = tTarget2_post(transition)
end

tToTarg_pre.m

function [fire, transition] = tToTarg_pre(transition)
b3 = get_place('p4');
fire = b3.tokens;

tToTarg_post.m

function [] = tToTarg_post(transition)
ACTION_RotateToTarget;
end

tToTarg2_pre.m

function [fire, transition] = tToTarg2_pre(transition)
b22 = get_place('p23');
fire = b22.tokens;

tToTarg2_post.m

function [] = tToTarg2_post(transition)
ACTION_RotateToTarget;
end
