May 2012
Summary (Abstract)
The aim of this project is to create a robot that implements a searching routine to find objects in an area and remove them from the search area. The searching routine will be fully autonomous, meaning that the robot will find objects and remove them, making all the decisions and calculations by itself with no interaction from the user. Lego Mindstorms NXT is a programmable robotics kit by LEGO. In this project the Lego Mindstorms NXT 2.0 is used, which is the newest set in LEGO's Mindstorms series. The NXT we are going to design will have 2 wheels on individual motors and will use an ultrasound sensor to find the objects. The NXT will calculate and find the object closest to its current position and drive towards it. We constructed a grabbing mechanism: a claw which uses a third motor to open and close. After the NXT has found and positioned itself in front of the object, it will close the claw and drive to the edge of the search area. We will be using a Bluetooth connection to communicate between the NXT and a computer. This is implemented using Petri nets and GpenSIM.
Acknowledgements
We would like to thank our teacher prof. Reggie Davidrajuh for guidance and tips on GpenSIM, and Ståle Freyer for lending us the Lego kit and helping us when we had questions about the Lego and setting up the connection.
Contents

1 Introduction
  1.1 Problem definition
  1.2 Motivation
2 Background
  2.1 Lego Mindstorms NXT
  2.2 MATLAB
  2.3 RWTH - Mindstorms NXT Toolbox for MATLAB
  2.4 Petri Net
  2.5 GpenSIM
3 Method and design
4 Testing, analyzing and result
5 Discussion
  5.1 Connection
  5.2 Petri Net
  5.3 Further work
6 Image Reference
Bibliography
A Building guide
B User manual for Lego NXT Clean-up robot
  B.1 Introduction
  B.2 The search area
  B.3 Adjusting and starting the program
C Installation guide
  Installing the program
  Setting up the bluetooth.ini file
  Connecting to the NXT
D MATLAB code
  D.1 Action Commands
  D.2 Main program
  D.3 Transitions (in alphabetical order)
Chapter 1
Introduction
This project was created for the subject MID-280 Discrete Simulation and Performance Analysis, spring semester 2012, by Vegard Torkelsen and Jonathan Brian T. Alcoriza.
1.1 Problem definition

In this project the Lego Mindstorms NXT 2.0 will be used, which is the newest set in LEGO's Mindstorms series. The NXT we are going to design will have 2 wheels on individual motors and will use an ultrasound sensor to find the objects.
1.2 Motivation
We are both studying Cybernetics and played with Lego when we were younger. Robots and new technology interest us, and being among the first students to get to program and try out the Lego NXT at the University of Stavanger is a good opportunity for us to try something new.
Chapter 2
Background
2.1 Lego Mindstorms NXT
Lego Mindstorms NXT [8] is a programmable robotics kit by LEGO. In this project the Lego Mindstorms NXT 2.0 was used, which is the newest set in LEGO's Lego Mindstorms series.
2.1.1 The NXT intelligent brick
The main component of the Lego Mindstorms kit is the NXT intelligent brick, which is a brick-shaped computer. It can get input from up to four different sensors[3] and has three outputs which can control up to 3 motors using RJ12 cables. It has a monochrome 100x64 pixel screen and four buttons for navigating hierarchical menus. The brick also has a speaker that can play sound files.
Technical specifications:
- 32-bit ARM7 microcontroller
- 256 KB FLASH, 64 KB RAM
- 8-bit AVR microcontroller
- 4 KB FLASH, 512 Bytes RAM
- Bluetooth wireless communication
- USB full speed port, 12 Mbit/s
- 4 input ports, 3 output ports, 6-wire cable digital platform
- 100 x 64 pixel LCD graphical display
- Loudspeaker, 8 kHz
2.1.2 Sensors
There are many Lego sensors[3] that can be bought separately, as well as third-party sensors, such as color sensors, sound sensors, compasses, gyroscopes, RFID readers, accelerometers and temperature sensors.
- Ultrasonic sensor - measures distances in centimeters and in inches, and it can detect movement. The sensor can measure distances from 0 to 233 cm with a precision of 3 cm. The ultrasonic sensor measures distance by calculating the time it takes for a sound wave to hit an object and return.
- Light sensor - detects light levels by sensing the reflected light using a built-in red LED, or ambient light. It can also read the light intensity of colored surfaces.
- The servo motors have built-in rotary encoders that sense the rotation of the motor with an accuracy of one degree.
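The time-of-flight principle used by the ultrasonic sensor can be sketched as follows. This is a language-neutral Python illustration, not NXT firmware code; the speed of sound and the example echo time are assumptions for illustration:

```python
# Time-of-flight distance estimate, as used conceptually by the ultrasonic
# sensor: the sound wave travels to the object and back, so the one-way
# distance is half the total path. Speed of sound ~343 m/s in air at ~20 C
# (an assumed value, not taken from the NXT documentation).

SPEED_OF_SOUND_CM_PER_S = 34300  # ~343 m/s expressed in cm/s

def echo_to_distance_cm(echo_time_s: float) -> float:
    """Convert a round-trip echo time (seconds) to a one-way distance in cm."""
    return SPEED_OF_SOUND_CM_PER_S * echo_time_s / 2

# Example: an echo returning after ~5.83 ms corresponds to roughly 100 cm.
print(round(echo_to_distance_cm(0.00583)))  # -> 100
```

The sensor's 233 cm maximum range thus corresponds to an echo window of only about 14 ms.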
2.2 MATLAB
MATLAB[10] is a high-level programming language for numerical computing, data acquisition and analysis. It can be used to control LEGO NXT robots over a Bluetooth serial port (serial-port communication is part of the base functionality of MATLAB) or via a USB connection. Several papers found in [14] take on the topic of LEGO NXT programming in MATLAB.
2.3 RWTH - Mindstorms NXT Toolbox for MATLAB

The toolbox uses the Lego Mindstorms NXT Bluetooth communication protocol to control the intelligent NXT brick via a wireless Bluetooth connection or via USB.
2.4 Petri Net

Petri nets are used to model discrete event dynamic systems that are not synchronized (asynchronous, not driven by a clock) but move forward with the occurrence of events.
2.5 GpenSIM
The General Purpose Petri Net Simulator (GpenSIM)[1] is a Petri net simulator that satisfies three criteria: it is flexible, extensible and easy to use. GpenSIM is a toolbox on the MATLAB platform, so diverse toolboxes like the Fuzzy Logic Toolbox and the Control System Toolbox can be used in models developed with GpenSIM. There are other tools for discrete event simulation, like Automata, Stateflow and Petri net tools, but unlike GpenSIM these are stand-alone systems, and integrating them with other types of tools, like control systems, is not possible.
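GpenSIM defines Petri nets in MATLAB, but the basic firing rule it simulates can be shown language-neutrally. The sketch below, with made-up place and transition names, fires a transition only when every input place holds a token, moving tokens from inputs to outputs:

```python
# Minimal Petri net firing rule: a transition is enabled when all of its
# input places hold at least one token; firing consumes one token from each
# input place and deposits one token in each output place.

def enabled(marking, transition):
    return all(marking[p] >= 1 for p in transition["inputs"])

def fire(marking, transition):
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    for p in transition["inputs"]:
        marking[p] -= 1
    for p in transition["outputs"]:
        marking[p] += 1

# Hypothetical two-place net: tStart moves a token from pStart to pScan.
marking = {"pStart": 1, "pScan": 0}
tStart = {"inputs": ["pStart"], "outputs": ["pScan"]}

fire(marking, tStart)
print(marking)  # -> {'pStart': 0, 'pScan': 1}
```

In GpenSIM the same idea is expressed with a definition file (places, transitions, arcs) plus per-transition pre- and post-processor .m files, as used throughout Chapter 3.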
Chapter 3
Method and design
We used a simple robot with two individually rotating wheels, one motor for each wheel, and a balance wheel behind. The robot has 4 sensors: a push sensor, a light sensor, an ultrasonic sensor and the rotational sensors in the servo motors. The push sensor is placed on top, and pressing it starts the program; it was added so that the robot would not start if we accidentally started the program from the computer. The light sensor faces down towards the floor and detects when the robot is about to drive outside of the marked search area. The ultrasonic sensor is placed where the front is considered; it is like the eyes of the robot, and it is what is used to find objects. The light sensor does not work well for finding objects, because special lighting conditions are needed and the search objects would have to be different. The rotational sensors are used to calculate the positions and angles of the robot.

We are going to design the robot to have a flexible autonomous behavior, meaning that the robot has a main goal, which is finding objects in the search area, but we want it to know what to do if it does not find an object, or finds an object outside the search area. We will do this by planning and optimizing action sequences to achieve our goals, and we will modify our plans as we go further into the programming. We will improve by learning from experience, debugging and analyzing past runs to detect potential for improvement and implementing the improvements in the system. While testing and analyzing we found that balls were the easiest search objects for the ultrasonic sensor to read when scanning. Another method that can be used is making the
robot move in a specific pattern or follow a path like in [11]. We chose not to use this method since we wanted a more autonomous behaviour.
The variable 'SerialPort' is where the Bluetooth is connected; this is the most important part of this file. We used Windows XP, where it is a COM port, in our case COM port 23. For further explanation see Appendix C.
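A bluetooth.ini for the RWTH toolbox might look roughly like the fragment below. Only the SerialPort value is taken from our setup; the section name and pause keys are written from memory of the toolbox and should be checked against the RWTH toolbox documentation:

```ini
; Hypothetical bluetooth.ini sketch -- only SerialPort=COM23 comes from this
; report; the remaining keys are illustrative and may differ in your toolbox.
[Bluetooth]
SerialPort=COM23
SendSendPause=5
SendReceivePause=25
```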
3.1.2 Main program
The main program consists of the following files, found in Appendix D.2; for more information on how to run the program, see Appendix B:
CONNECT.m
This file is the initiation file; it connects the NXT to the PC using Bluetooth, and it reads the bluetooth.ini file. Running this .m-file will reset all connections and sensors and connect the NXT to the Bluetooth unit.
NXT_Clawbot_def.m
This is the definition file of the Petri net. It includes arcs, places and transitions.
NXT_Clawbot.m
initNXT.m
Initiation file for the NXT. It sets the different global variables needed to run the robot and assigns the different ports to motors and sensors. See Appendix D for more information and all the MATLAB code.
3.1.3 Modules
There are several ways to detect objects: in [12] a camera is used to detect faces in images, and a similar technique could have been used to detect the balls if the robot had been fitted with a camera. We decided to go with a simpler approach, using a sonar sensor as mentioned earlier. Another method worth mentioning is simultaneous localization and mapping (SLAM), which is used in [6]. This method would be too complicated and time consuming for us to implement.

The scan sequence is the sequence that scans an area in front of the robot for objects. The robot is started by pressing the start button. It will turn 45 degrees to the right and then start searching while turning to the left. The NXT is limited in that it cannot read from its sensors and send signals to its motors at the same time, so the robot turns to the left in small increments, reads from the ultrasonic sensor (see figure 2.3), sends the reading to the computer, and continues until it has turned 90 degrees. For every rotational step, the robot saves the distance returned by the ultrasonic sensor and the angle it has rotated. The distance to the closest object is then calculated, and the NXT positions itself towards the object by rotating the calculated angle. The angle is calculated by taking all the saved distance points, finding the one closest to the NXT, taking all points that vary at most +/- 5 cm from the closest one, and picking the middle one. The NXT will then drive forward the calculated distance, grab the object, and drive until it finds the edge of the search area (we created a search area by making a circle with A4 paper). When the border of the search area is detected, the NXT stops, releases the object, reverses a bit, turns 180 degrees, drives forward, and starts the searching loop again.

If all scanned points from the ultrasonic sensor are over 100 cm away, all points will be set to zero. This is done so that the robot does not detect objects that are too far away. If the scan area size is increased, this threshold should be increased as well.

In the NXT_Clawbot_def.m file we could not divide the code into modules; it would stop when trying to change modules. All the .m-files used can be found in Appendix D. The searching routine can result in four different scenarios, which we have divided into modules:
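The closest-point selection described above can be sketched as follows. This is a Python illustration of the logic, not the MATLAB code from Appendix D, and the 38-step scan with an object around steps 15-25 is made-up data:

```python
# Pick the target from one 90-degree scan:
# 1) treat the scan as empty if all readings are over 100 cm,
# 2) otherwise find the closest reading,
# 3) keep the indices within +/- 5 cm of it (the same object),
# 4) aim at the middle index of that band.

def pick_target(distances, max_range=100, band=5):
    """Return (index, distance) of the scan point to aim at, or None."""
    if all(d > max_range for d in distances):
        return None  # all points too far away; treated as no object
    closest = min(distances)
    # indices belonging to the same object (within the +/- band of the closest)
    hits = [i for i, d in enumerate(distances) if d - closest <= band]
    middle = hits[len(hits) // 2]
    return middle, distances[middle]

# Hypothetical 38-step scan with a ball ~22 cm away around steps 15-25.
scan = [120] * 38
for i in range(15, 26):
    scan[i] = 22

print(pick_target(scan))  # -> (20, 22): the middle of the detected band
```

The returned index maps back to a rotation angle via the saved encoder readings, which is how the robot knows how far to turn towards the object.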
Module 1:
Object detected normally: In this module, the robot detects an object in the search area at the initial scan, grabs the object and drops it off at the edge of the search area. After the robot has removed 5 objects, the program will stop.
10. Reverses
11. Turns 180 degrees
12. Drives forward
13. Back to 2

Here we set it to run through the loop 5 times. This can be seen at the arc
The points located between 15 and 25 in figure 3.2 form an object close to the robot; the point at the center of those points is marked red and is the center of the object it found. The robot will use this point to calculate which angle the servo motors have to rotate in order to position itself in front of the object and drive towards it.
Figure 3.3 is an illustration of the scan where the search object is found in the middle of the scan.
As shown in the code, the tScan1_post.m file calls ACTION_ScanDistance.m, where at the end of the code the edge detector is as seen here:
% check if target is at edge of scan-area.
if distance(1) > 0 || distance(2) > 0 || distance(37) > 0 || distance(38) > 0
Here tDToTarg_pre checks that edgeDetected from the code above is 0 and that there is a token at p5, as seen here:
function [fire, transition] = tDToTarg_pre(transition)
global global_info;
b4 = get_place('p5');
if global_info.edgeDetected == 0 && b4.tokens == 1
    fire = 1;
else
    fire = 0;
end
function [fire, transition] = tClose_pre(transition)
global global_info;
b5 = get_place('p6');
if (b5.tokens && global_info.falseTarget == 0)
    fire = 1;
else
    fire = 0;
end
function [] = tClose_post(transition)
pause(5);
disp('Target acquired!! These ARE the droids we are looking for');
ACTION_CloseClaw;
end
After the robot has picked up the search object, tRot180 fires and tRot_post calls ACTION_Rotate180.m as shown here, which makes the robot rotate 180 degrees.
tBorder fires, and in tBorder_post ACTION_FindBorder.m is called, which uses the light sensor to find the border and will make the robot stop before driving outside the search area.
After the border is found, tOpen will fire, and in tOpen_post ACTION_OpenClaw.m is called, as seen here, which opens the claw and releases the search object.
Then tReverse fires and calls ACTION_Reverse.m as seen here, and the robot reverses.
Then the robot rotates when tRot1802 calls ACTION_Rotate180.m, as seen here.
Then it drives forward a bit when tForward calls ACTION_Forward.m when firing, as seen here.
tBuffer is just a transition we added because the net would not run without it, for some unknown reason.
tDone works as a sink and will fire when p18 has 5 tokens, which is when 5 objects have been found and placed outside the search area. A typical run is shown in Example run.
Module 2:
The robot detects an object, but it has been detected at either edge of the sensor's sweep during the scan sequence. The robot will therefore position itself towards where it thinks the ball is, scan again, position itself towards the ball according to the new scan, and then move into Module 1. This module was created in order to make sure that the robot correctly finds objects. While testing, we found that without this module the robot often missed the search object and sometimes bumped it out of the search area.
Object detected on the edge of the sensor: As seen in figure 3.1.3, the object was found at the beginning of the scan. When the robot completes the sequence, it will believe that the center of the search object is located where the red point is marked. Since the robot might not have scanned the entire search object, this might be inaccurate. It will therefore rotate towards the marked point, then do another 90 degree scan. This ensures that the robot scans the entire search object and gets a more accurate reading. The same will happen if the sensor scans the closest object at the end of the 90 degree scan.
Figure 3.6 is an illustration of the robot failing to scan the whole search object because it is on the edge of the scan. The same will happen if the search object is scanned on the other edge.
% check if target is at edge of scan-area.
if distance(1) > 0 || distance(2) > 0 || distance(37) > 0 || distance(38) > 0
    global_info.edgeDetected = 1;
else
    global_info.edgeDetected = 0;
end
When there is a token at p5, tEdge is enabled. tEdge_pre.m will check that edgeDetected is 1 and will fire, as shown here:
function [fire, transition] = tEdge_pre(transition)
global global_info;
b5 = get_place('p5');
if b5.tokens && global_info.edgeDetected == 1
    fire = 1;
else
    fire = 0;
end
tScan3 will fire, and tScan3_post calls ACTION_ScanDistance.m as seen here, which makes the robot scan again to recalculate the position of the search object:
When tToTarg fires, tToTarg_post.m will call ACTION_RotateToTarget.m as seen here, making the robot turn towards the search object.
Module 3:
No object detected: All points scanned are over 100 cm away; this means there are no objects in the current scan, and the robot will continue to search towards the left. If the robot has searched an entire 360 degree rotation but still has not found any objects, it will move a certain distance forward, then start scanning again.
If an object is found, tTarget will fire and the token from P3 will be placed in P4, but if nothing is detected, tNoTarg will fire and a token will be placed in P14. A new scan will start (the robot just continues to scan to the left) and a token is placed in P15; if an object is found, tTarget2 will fire and the token from P15 will be placed in
P4, but if nothing is detected, tNoTarget2 will fire and a token will be placed in P14 and P16. It has now scanned 180 degrees. This will loop until P16 has three tokens and t360Scanned fires, or the sensor finds something and tTarget2 fires and the routine continues as in Module 1. If t360Scanned fires, three tokens will be taken from P16 and one will be placed in P17. After that, tForward2 will fire, making the robot move forward a bit, a token will be placed in P2, and the routine will continue. tCleanUp1 and tCleanUp2 work as sinks; these take the excess tokens from P16 if tTarget2 fires.
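The Module 3 loop amounts to a simple counter: each failed 90-degree scan brings the robot closer to a full rotation, and after four failed sectors (the initial scan plus the three rescans tracked by P16) the robot gives up and drives forward. The Python sketch below models only that control flow, with made-up scan results instead of sensor data:

```python
# Sketch of the Module 3 control flow: scan consecutive 90-degree sectors to
# the left until a target is found, or a full 360 degrees has been covered
# (initial scan + 3 rescans, i.e. 3 tokens in P16), then move forward.

def search_cycle(scan_results):
    """scan_results: booleans for consecutive 90-degree scans (True = target).
    Returns ('target', scans_used) or ('move_forward', 4)."""
    failed_sectors = 0
    for i, found in enumerate(scan_results[:4]):
        if found:
            return ("target", i + 1)   # tTarget/tTarget2 fires; continue as Module 1
        failed_sectors += 1            # tNoTarg/tNoTarget2 fires; token towards t360Scanned
        if failed_sectors == 4:
            return ("move_forward", 4) # t360Scanned then tForward2 fire
    return ("move_forward", 4)

print(search_cycle([False, False, True]))          # -> ('target', 3)
print(search_cycle([False, False, False, False]))  # -> ('move_forward', 4)
```

After a 'move_forward' outcome the whole cycle restarts from a new position, exactly as the token returning to P2 restarts the net.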
Figure 3.8 shows what the sensor readout looks like when no objects are found.
Figure 3.1.3 is an illustration of the search object being outside of the robot's scan.
% Confirm target detection and plot destination
finalTarget = tachoArea(int8(length(tachoArea)/2));
if distance > 100
    global_info.targetAcquired = 0;
elseif distance < 100
    global_info.targetAcquired = 1;
    plot(finalTarget/5, distance(finalTarget/5), '-*r');
end
function [fire, transition] = tNoTarget_pre(transition)
global global_info;
% global targetAcquired;
b2 = get_place('p3');
if ((b2.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end
When tScan2 fires, it calls ACTION_ScanDistance2.m in tScan_post.m as shown here; this will make the robot scan another 90 degrees to the left.
ACTION_ScanDistance2.m works like ACTION_ScanDistance.m and also sets targetAcquired as shown above. After tScan2 has fired, a token is placed in p15, which enables tTarget2 and tNoTarget2. If targetAcquired is 1, tTarget2 will fire and the routine continues as in Module 1; however, if targetAcquired is 0, tNoTarget2 will fire, putting a token in p16 and one in p14, as seen here:
function [fire, transition] = tNoTarget2_pre(transition)
global global_info;
% global targetAcquired;
b14 = get_place('p15');
if ((b14.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end
When a token is placed in p14, tScan2 will be enabled again, making the robot scan again. If a target is acquired, tTarget2 will fire and the routine continues as in Module 1, and tCleanUp1 will also fire, sinking the excess token. If no target is acquired, tNoTarget2 will fire again, putting another token in p16. This continues until a target is acquired or t360Scanned is enabled, which happens when 3 tokens are at p16 and one token is at p14, as shown here:
function [fire, transition] = t360Scanned_pre(transition)
display('WE ARE IN T360SCANNED_POST');
b15 = get_place('p16');
b13 = get_place('p14');
fire = (b15.tokens == 3) && b13.tokens;
tForward2 will now be enabled, and when firing it calls ACTION_Forward.m, as seen here:
Module 4:
Object detected outside the search area (false target): While scanning, the robot might detect an object it has already removed from the search area. When this happens, the robot will drive towards the object as in Module 1, but when the search area border is detected with the light sensor, the robot will stop, rotate 180 degrees and search again.
Note: While testing, we experienced that the robot got stuck when we made the search area in various shapes. It might get stuck in an infinite loop if two search objects are on either side of a corner and the robot is searching between them. It will end up scanning, finding an object, being stopped by the border, turning around, scanning again, finding the other object, and continuing like this until the battery dies. To make sure this does not happen, the search area should be as circular as possible.
Figure 3.11 is an illustration of the NXT robot finding an object outside the search area. The search area border is a black line, and the robot will detect it with the light sensor, which points towards the floor, while driving towards the search object.
At the end of ACTION_DriveToTarget.m, which is called in tDToTarget, the light sensor is checked to see if it finds the border of the search area, as seen here:
OpenLight(SENSOR_2, 'ACTIVE');
light = GetLight(SENSOR_2);

if light > 350
    global_info.falseTarget = 1;
end
CloseSensor(SENSOR_2);
end
end
disp('falseTarget');
global_info.falseTarget
When a token is placed in p6, tClose and tFalseTarget are enabled. If falseTarget is 0, tClose will fire and the routine continues as in Module 1; however, if falseTarget is 1, then tFalseTarget will fire, as shown here:
function [fire, transition] = tFalseTarg_pre(transition)
global global_info;
b5 = get_place('p6');
if (b5.tokens && global_info.falseTarget == 1)
    fire = 1;
else
    fire = 0;
end
Then tRot1803 fires and calls ACTION_Rotate180.m in tRot1803_post.m as shown here, making the robot rotate 180 degrees.
Then tForward2 fires and calls ACTION_Forward.m in tForward2_post as shown here, making the robot drive forward a bit.
Chapter 4
Testing, analyzing and result
In our testing we tried different search objects, like cans, balls and small square pieces built from Lego. We found that the balls worked best with the ultrasonic sensor. In this chapter we will look at the different problems and
Figure 4.1: Here we assume that the Bluetooth connection has been established and that the software has been started according to the user manual. The user will here push the start button and the routine will begin. The first action the robot will do is a 90 degree scan, illustrated by the blue field in front of the robot.
Figure 4.2: The robot will here detect the ball and position itself towards it.
Figure 4.3: The robot will then drive towards the ball, stopping directly in front of it. It will then close its claw, securing the ball.
Figure 4.4: After the robot has secured the ball it will turn 180 degrees.
Figure 4.5: Once the robot has repositioned itself it will drive until it reaches the border.
Figure 4.6: As the robot reaches the border, the light sensor will detect the change in color and the robot will stop. When the robot has stopped the claw will open to release the ball.
Figure 4.7: Once the ball is released, the robot will drive backwards, reentering the search area.
Figure 4.8: The robot will then rotate 180 degrees so that it is pointing towards the center of the area. After it has rotated, it will drive forward by a fixed length.
Figure 4.9: When the robot has stopped, it will scan again. In this case there are no balls in the search area, and it will continue searching in 90 degree intervals until it has searched a full 360 degrees.
Figure 4.10: When the robot has scanned 360 degrees around one point without detecting any balls it will drive forward.
Figure 4.11: The robot will then scan a new 90 degree area. Again the robot will fail to detect any balls and will continue scanning another 90 degrees.
Figure 4.12: On the next scan it will detect a ball; however, it is on the edge of the scanned area.
Figure 4.13: The robot will therefore position itself towards where it thinks the ball is and scan again. This is to ensure that it is positioned correctly towards the ball.
Figure 4.14: As earlier the robot will now position itself towards the ball, and drive towards it.
Figure 4.15: Once it has reached the ball, it will grab it and rotate 180 degrees.
Figure 4.16: The robot will then drive forward until it reaches the border, let go of the ball, and reverse as shown earlier.
Figure 4.17: It will again rotate 180 degrees so that it is pointing towards the center of the search area.
Figure 4.18: Again it will start to scan. This time, the robot will have to scan 360 degrees until it finds the last ball.
Figure 4.19: Once the ball is detected it will position itself towards it, and drive forward.
Figure 4.20: When the robot stops, it will close the claw to secure the ball.
Figure 4.21: After the ball has been secured, it will rotate 180 degrees and drive towards the border. Once the robot has reached the border and released the ball, the program is complete.
Below is a table showing estimated times the robot uses to perform different actions. The times were found during a test where the ball was 20 cm away from the robot during the scan sequence, and the robot had to travel 1 m to reach the border. The times are rough estimates; exact times are not possible to acquire for all actions due to the random locations of the balls in the search area.

Action performed           Time taken
Normal scan                29 s
Reposition towards ball     5 s
Drive to ball (20 cm)       7 s
Close claw                  4 s
Drive to border (1 m)       6 s
Open claw                   4 s
Drive backwards             2 s
Rotate 180 degrees          5 s
Drive forward               5 s
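From the table, a rough lower bound for one complete pick-up cycle under these test conditions can be added up (assuming a single scan with no rescans, and no Module 2/3 detours):

```python
# Rough single-cycle time from the measured estimates above (in seconds).
times = {
    "normal scan": 29,
    "reposition towards ball": 5,
    "drive to ball (20 cm)": 7,
    "close claw": 4,
    "drive to border (1 m)": 6,
    "open claw": 4,
    "drive backwards": 2,
    "rotate 180 degrees": 5,
    "drive forward": 5,
}

total = sum(times.values())
print(total)  # -> 67 seconds per object under these test conditions
```

The 29 s scan dominates the cycle, which is why the speed improvements discussed in Chapter 5 focus on the search routine.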
A short video was created that shows the search function of the robot; it is available if the reader has access to the complete zip file.
- NXT limitations
The NXT cannot send and receive signals at the same time, and transferring large amounts of data is something the NXT cannot handle. One challenge was if the Clean-up robot detected one of the objects it had already cleaned up and placed outside the search area. The NXT could not continuously scan with the light sensor to check that it was within the search area while scanning for objects with the ultrasonic sensor. This makes the scanning routine very slow. The NXT has only 130 kB of storage space, which might not be a big problem since
programs use little storage space; but if custom sounds are desired on the NXT, the storage space will be used up with only a few files.
- Lego bricks
We were given a building set with only enough pieces to build the different robots from that set. The robots from that set could not perform the tasks we wanted; they had no grabbing mechanism. The robot we created is not completely symmetrical in its looks, since we did not have enough pieces. There were also not enough pieces to build the claw like we wanted; we had hoped to build a claw that would lift up the search object when grabbing it.
- Servo motors
With two individual and unique motors, the wheels do not rotate at the same speed. One motor always starts a short time before the other, causing the robot to turn a bit when we program it to go straight forward. Sometimes it also has problems with very small rotations. This may cause the robot to miss its intended target.
- Sensors
The ultrasonic sensor sometimes has a hard time sensing different objects. For example, in an obstacle avoidance program using the ultrasonic sensor for detection, it can avoid a square item standing perpendicular to the sensor, but if it finds the object at a different angle, it gets inaccurate readings, which may result in the robot bumping into items.
4.3 Analyzing
By doing test runs with the robot we have seen that the program itself runs correctly through all stages of the Petri net. However, the inaccuracy of the hardware components on the robot will sometimes lead to failures in the program's functionality. For instance, if the robot detects a ball and is to drive forward to the ball, it might not drive straight enough to be able to position
itself correctly. This may result in a failure to grab the search object, and even though it fails to grab the search object, the NXT will continue without removing it from the search area. If the program is set to find a specific number of search objects, this will result in a search object being left in the search area. Also, when the robot has found an object, grabbed it and is on its way to put it outside the search area, and there are other objects in the way, the robot will simply drive straight towards them and hit them. No module was created for the scenario where two search objects are placed right next to each other. If this scenario happened, the program would just read the two objects as one big object and drive to the middle of them, picking up both or neither. While testing, the search objects were always placed apart from each other so this would not happen.
4.4 Result
The clean-up robot was completed and works like it is supposed to, but with the limitations and challenges mentioned. The routine is fully autonomous, like we wanted it to be. In ideal conditions (a set search environment, search objects placed apart and good lighting) the robot can run by itself. With the limited resources and time, we are satisfied with the outcome. We have reached the conclusion that using a model-and-simulation approach to program a robot at this intended level of autonomy performs poorly compared to other types of programming. In this report we have included building instructions for the exact model we created (see Building guide), a full user guide on how to run the program with the robot model (see User manual for Lego NXT Clean-up robot), and an installation guide covering the programs needed to run our program (see Appendix C).
Chapter 5
Discussion
This chapter discusses the limitations, the challenges and possible solutions.
5.1 Connection
Since Bluetooth is the only wireless connection available on the NXT kit, we are bound to use it if we do not want a long USB cable hanging from the robot. Bluetooth makes it difficult to send and receive data at the same time. The NXT brick also has a 30 ms latency when it switches from receiving data to sending data, which means that the expected latency for a sensor reading request is 60 ms. At an industrial level this might be a problem and even dangerous. In [14] it is recommended to use USB for real-time programming, but then we would need a long USB cable, and the robot might detect the cable as an obstacle. A WiFi connection would be preferable, since it can carry more data and is also easier to connect.
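The 60 ms figure follows from the 30 ms direction-switch latency: a sensor reading is a request followed by a reply, so the round trip pays the switching cost roughly twice, capping how often the computer can poll a sensor. This interpretation of the numbers from the text can be checked with trivial arithmetic:

```python
# Expected Bluetooth sensor-request latency and the polling rate it allows.
SWITCH_LATENCY_MS = 30  # NXT receive->send switching latency (from the text)

# Request + reply: the round trip pays the switching latency twice
# (our interpretation of the 60 ms figure given in the text).
round_trip_ms = 2 * SWITCH_LATENCY_MS
readings_per_second = 1000 // round_trip_ms

print(round_trip_ms, readings_per_second)  # -> 60 16
```

At most ~16 sensor readings per second explains why the incremental scan routine, which interleaves many sensor reads and motor commands, is so slow.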
have to). Petri nets are more suited for concurrent programming, which this program is not. Creating the program directly in MATLAB would be easier and possibly better.
Speed
The robot has a slow search routine, since its calculations are slow and the NXT cannot process large amounts of data at the same time. This could be compensated for by adding more robots to the search area, controlled using swarm intelligence[9], where the robots share their current positions so that each robot searches a distinct part of the search area and they avoid colliding with each other. This would severely decrease the time it takes to clear the area of objects.
Increased accuracy
One of the most important improvements this robot needs is increased accuracy. Two things would mainly improve its performance: more measurements from the sonar, so that the robot stops at the correct distance from the ball, and better motor control, so that the robot drives straighter than it currently does and actually reaches its designated target. More sonar measurements can be obtained by scanning for the ball several times as the robot advances towards it, although this will considerably increase the time the robot needs to clear objects from the search area. Another method is to add a camera, either on the robot or externally with an overhead view, and use object detection algorithms [12]. Some of these methods are covered in [15], but there the program was not created with GPenSIM, but directly in MATLAB.
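One possible shape for such an advance-and-rescan loop is sketched below. This is hypothetical: the stop threshold and step size are illustrative assumptions, while the cm-to-degrees factor of 18 and the `global_info` port fields are taken from the project's DriveToTarget listing in Appendix D.

```matlab
% Illustrative sketch: advance in short steps, re-measuring before each one.
OpenUltrasonic(global_info.portUS);
dist = GetUltrasonic(global_info.portUS);
while dist > 10                            % stop ~10 cm from the ball (assumed threshold)
    step = min(dist - 10, 20);             % short step in cm, never overshooting
    mDrive                    = NXTMotor(global_info.portM);
    mDrive.Power              = 50;
    mDrive.TachoLimit         = round(step * 18); % same cm-to-degrees factor as DriveToTarget
    mDrive.ActionAtTachoLimit = 'Brake';
    mDrive.SendToNXT();
    pause(2);
    dist = GetUltrasonic(global_info.portUS);     % fresh reading corrects drift
end
CloseSensor(global_info.portUS);
```

Each fresh measurement corrects accumulated heading and distance error, at the cost of one Bluetooth round trip per step.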
Color detection
The robot can be fitted with a color sensor, so that it can remove balls of a certain color.
Obstacle detection
For the robot to be fully autonomous, it would need an obstacle detection sensor. All four sensor ports were used on this robot, but adding an additional NXT programmable brick would provide another four sensor ports.
Communication
As mentioned earlier in this report, we had severe problems with the Bluetooth connection. Improving this connection or finding alternative forms of communication would make the robot a lot more user friendly.
Tracking
It would be ideal if the robot could remember where it has already searched so that it would not have to search there again, which would speed up the routine. It would also be better if the robot could place all the search
When building the robot a different claw design was desired: one where the claw would lift up the object the robot found. We would also like the claw to be bigger, so that accuracy would be better.
Chapter 6
Image Reference
Figures 3.2, 3.1.3 and 3.8 are graphs showing the ultrasonic sensor readouts that were sent to MATLAB.
Figures 3.12, 3.1, 3.1.3, 3.7 and 3.10 are drawn using PIPE2.
Figures 2.3, 2.4, 2.2 and 2.5 are found at
http://www.brickset.com/browse/themes/?theme=Mindstorms&subtheme=NXT
Figures 3.3, 3.6, 3.1.3 and 3.11: the Lego model was created using Lego Digital Designer.
1 PIPE2 is an open source, platform independent tool for creating and analyzing Petri nets, including Generalized Stochastic Petri nets.
2 LEGO Digital Designer, or LDD, is a free computer program produced by the LEGO Group as a part of LEGO Design byME. The program allows users to build models using virtual LEGO bricks, in a computer-aided design like manner. Found here: http://ldd.lego.com/
3 Paint.NET is free image and photo editing software for computers that run Windows.
Bibliography
[1] Reggie Davidrajuh. GPenSIM: A New Petri Net Simulator. In: Petri Nets Applications, Pawel Pawlewski (Ed.), ISBN 978-953-307-047-6. InTech, 2010.
[2] The support of autonomy and the Vol 53(6):1024-1037, Dec 1987.
[3] Wikipedia, The Free Encyclopedia. "Lego Mindstorms NXT - Sensors". http://en.wikipedia.org/wiki/Lego_Mindstorms_NXT.
[4] Roar Fjellheim. AI and autonomy in oil and gas, March 2012. NFA: Autonomy in oil and gas industry.
[5] Francois Charbonnier, Hassane Alla, and Rene David. The supervised control of discrete-event dynamic systems. Vol. 7, no. 2, March 1999.
[6] Staffan Ekvall, Danica Kragic, and Patric Jensfelt. Object detection and mapping for service robot tasks. Robotica, 2007.
[7] Lego. Lego Digital Designer. http://ldd.lego.com/.
[8] Lego. "NXT". Retrieved April 13, 2012, from http://mindstorms.lego.com/en-us/whatisnxt/default.aspx.
[9] Alcherio Martinoli. Swarm intelligence in autonomous collective robotics: From tools to the analysis and synthesis of distributed control strategies. 1999.
[10] MathWorks. MATLAB - The language of technical computing. http://www.mathworks.se/products/matlab/.
[11] Robert W. Hogg, Arturo L. Rankin, Stergios I. Roumeliotis, Michael C. McHenry, Daniel M. Helmick, Charles E. Bergh, and Larry Matthies. Algorithms and sensors for small robot path following. Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002.
[12] Constantine P. Papageorgiou, Michael Oren, and Tomaso Poggio. A general framework for object detection. Center for Biological and Computational Learning, Artificial Intelligence Laboratory, MIT.
[13] RWTH. Install guide. http://www.mindstorms.rwth-aachen.de/trac/wiki/Download#InstallationGuide.
[14] RWTH Aachen University. "RWTH - Mindstorms NXT Toolbox for MATLAB". Retrieved February 6, 2012, from http://www.mindstorms.rwth-aachen.de/.
[15] Marco Casini, Andrea Garulli, Antonio Giannitrapani, and Antonio Vicino. A MATLAB-based remote lab for multi-robot experiments.
Appendix A
Building guide
See the additional file Building Instructions [claw].html
Appendix B
User manual for Lego NXT Clean-up robot
B.1 Introduction
This is a small user manual for the Lego NXT Clean-up robot. If you have not read the installation guide, do it now. The Clean-up robot is extremely easy to use once you have been able to get the bluetooth connection up and running.
manual you should have the robot connected to the computer over Bluetooth; if not, run the connect.m file and make sure that the MotorControl program is running on the NXT. Open NXT_ClawBot_def.m and change the variable numberOfBalls to the number of balls in your search area, then place your robot near the border of the search area, facing inwards. Run the file NXT_ClawBot.m; you should hear a beep from the robot to confirm that the program has started. When you are ready, press the push button on the robot and stand back. The robot will now start to remove the balls from the search area and will stop when all the balls have been removed.
Appendix C
Installation guide for Lego NXT Clean-up robot
Building instructions for the robot can be found in Appendix 1B and Appendix 2B. There you will find an HTML version and a Lego Digital Designer file; in the latter you can view the build instructions while being able to rotate the model. To be able to use this, you have to download and install Lego Digital Designer, found at [7].
address. Also make sure that you have updated the driver for the Bluetooth adapter.
1. Place the program files in a suitable folder.
2. In MATLAB, go to File -> Set Path and select the folder where you placed the files. Then click Add with Subfolders.
COM ports.
To find your COM port, you can double-click the Bluetooth icon on the right side of your task bar, then click the COM Ports tab. Here you should see two COM ports listed, one for outgoing and one for incoming. The bluetooth.ini file needs to have the outgoing COM port addressed to it. If only one COM port is listed, you have to add the second one: simply press Add and select the NXT.
MAC address
To find the correct MAC address, press Start -> Run and type in cmd. In the window that opens, type ipconfig /all. In the information that follows, look for Physical Address. There should be 12 letters and numbers following this line; that is your MAC address. Example: Physical address . . . . . . 00-0F-FE-54-37-C1
confirm the passkey 1234. A Bluetooth notification should now pop up on your computer; enter the same passkey and hopefully you have been able to connect the NXT to your computer through Bluetooth. If this fails, you can try to go the other way, by searching for Bluetooth units from your computer. To do this, double-click the Bluetooth icon on the right of your task bar and select Add. Make sure that the NXT's Bluetooth is turned on, check the box that says My device is set up and ready to be found, and press Next. Once you find the NXT, double-click it; you will now get a similar passkey on the computer and the NXT, so enter 1234 as before. When you have done this, double-check the COM ports and MAC address; if they have changed, change them accordingly in the bluetooth.ini file. Once you have done all this you can go on the NXT to Bluetooth and then to My Contacts. The computer should now appear here. Run the connect.m file in MATLAB and select the computer on the NXT, press connect on the NXT and select 1. If everything was successful you should now have connected the NXT to the computer. One way to verify that you have been able to connect is to check that the
on the NXT.
Keep trying to connect as explained before, rebooting the NXT once in a while.
Try pairing the NXT and the computer all over again, remembering to check the MAC address / COM ports.
It also appears that Mac computers handle Bluetooth much better than Windows; if you have one available, I highly suggest you try it first.
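After pairing, a quick way to confirm that the link works end-to-end is a minimal MATLAB session. This is a sketch using standard RWTH toolbox calls; it assumes bluetooth.ini has been configured as described above:

```matlab
% Minimal Bluetooth connection check (illustrative sketch).
h = COM_OpenNXT('bluetooth.ini');   % opens the connection described in bluetooth.ini
COM_SetDefaultNXT(h);
NXT_PlayTone(440, 500);             % the brick beeps if the link is up
voltage = NXT_GetBatteryLevel()     % battery level in mV; also confirms two-way traffic
COM_CloseNXT(h);
```

If COM_OpenNXT fails or the tone never plays, recheck the COM port and MAC address in bluetooth.ini before trying the full program.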
Appendix D
MATLAB code
D.1 Action Commands
The software has a series of files called action commands; these tell the robot what actions it should take. Below is a list of all the files and the actions the robot will perform.
ACTION_CloseClaw.m
This action will rotate the center motor, causing the claw to go from the open position to the closed position, enabling the robot to grab the spheres lying around the search area. It is important that this command is not sent to the robot while the claw is already closed, as this would cause the motor to try to reach a position it cannot get to.
global portC;
global global_info;

mClaw                    = NXTMotor(global_info.portC);
mClaw.SpeedRegulation    = true;
mClaw.Power              = 20; % positive = close
mClaw.TachoLimit         = 35;
mClaw.ActionAtTachoLimit = 'Brake';

mClaw.SendToNXT();
ACTION_OpenClaw.m
Similar to ACTION_CloseClaw.m; however, the claw will now open.
global portC;
global global_info;

pause(1)
mClaw                    = NXTMotor(global_info.portC);
mClaw.SpeedRegulation    = true;
mClaw.Power              = -20; % negative = open (values not recoverable from the scan; mirrored from ACTION_CloseClaw)
mClaw.TachoLimit         = 35;
mClaw.ActionAtTachoLimit = 'Brake';
mClaw.SendToNXT();
ACTION_DriveToTarget.m
The DriveToTarget action will drive the robot forward towards the search object so that it is within reach of the claws. The robot will stop at given distances and scan the light level of the surface it is currently on; this is to ensure that the robot does not exit the search area. The distance it will drive has been calculated in the scanDistance actions.
%drive to target
global global_info;
global MOTOR_A
global portM
global finalTarget
global distance
global falseTarget

falseTarget = 0;
drivenDistance = 0;
targetReached = 0;

while targetReached == 0 && falseTarget == 0;
    if (distance(finalTarget/5)*18) - drivenDistance > 360
        mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
        mTurn1.SpeedRegulation = false; % we could use it if we wanted
        mTurn1.Power = 80;
        mTurn1.TachoLimit = 360;
        mTurn1.ActionAtTachoLimit = 'Brake';
        mTurn1.SendToNXT();
        pause(2)
        drivenDistance = drivenDistance + 360;
        OpenLight(SENSOR_2, 'ACTIVE');
        light = GetLight(SENSOR_2)
        if light > 350
            targetReached = 1;
            global_info.falseTarget = 1;
        end
        CloseSensor(SENSOR_2);
    elseif (((distance(finalTarget/5)*18) - drivenDistance <= 360) && falseTarget == 0)
        mTurn1 = NXTMotor(global_info.portM); % ports swapped because it's nicer
        mTurn1.SpeedRegulation = false; % we could use it if we wanted
        mTurn1.Power = 80;
        mTurn1.TachoLimit = ((distance(finalTarget/5)*18) - drivenDistance+5);
        %mTurn1.ActionAtTachoLimit = 'Brake';
        mTurn1.SendToNXT();
        pause(2)
        targetReached = 1;
        OpenLight(SENSOR_2, 'ACTIVE');
        light = GetLight(SENSOR_2);
        trigger = 1;
        while trigger;
            light = GetLight(SENSOR_2);
            if light < 350;
                mTurn1                    = NXTMotor(global_info.portM); % ports swapped because it's nicer
                mTurn1.SpeedRegulation    = false; % we could use it if we wanted
                mTurn1.Power              = 80;
                mTurn1.TachoLimit         = 0;
                mTurn1.ActionAtTachoLimit = 'Brake';
                mTurn1.SendToNXT();
            elseif light > 350;
                mTurn1                    = NXTMotor(global_info.portM); % ports swapped because it's nicer
                mTurn1.SpeedRegulation    = false; % we could use it if we wanted
                mTurn1.Power              = 80;
                mTurn1.TachoLimit         = 1;
                mTurn1.ActionAtTachoLimit = 'Brake';
                mTurn1.SendToNXT();
                trigger = 0;
            end
        end;
        CloseSensor(SENSOR_2);

ACTION_Forward.m
When given this command, the robot will simply drive forward a set distance. This is typically used after it has completed a full 360 degree scan without finding any objects.
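Its body follows the same NXTMotor pattern as the other action commands; a minimal version might look like the sketch below. The power and tacho values here are illustrative assumptions, not the project's exact numbers.

```matlab
% Illustrative sketch of a fixed-distance forward move.
global global_info;

mTurn1                    = NXTMotor(global_info.portM); % both drive motors together
mTurn1.SpeedRegulation    = false;
mTurn1.Power              = 80;  % assumed: same power as ACTION_DriveToTarget
mTurn1.TachoLimit         = 600; % assumed: same step length as the reverse action
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
```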
This action is similar to ACTION_Forward but will go backwards instead of forward. This is used after the robot has removed a ball from the search area,
mTurn1                    = NXTMotor(global_info.portM); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false; % we could use it if we wanted
mTurn1.Power              = -80;
mTurn1.TachoLimit         = 600;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
When the ACTION_Rotate180 command is sent to the robot it will rotate 180 degrees. This is typically used after the robot has removed a ball from the
search area and backed away from the ball. This will position the robot so that it is facing towards the center of the search area.
global global_info
% Turn 90 degrees to right
pause(1)
mTurn1                    = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false; % we could use it if we wanted
mTurn1.Power              = -40;
mTurn1.TachoLimit         = 376;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
mTurn1
mTurn1.SpeedRegulation
robot has rotated when each distance measurement was taken. By doing this, we know how far the robot has to rotate to be able to face the object.
%turn to destination
global global_info;
global tachoLength;
global finalTarget;
global distance;
mTurn1                    = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false; % we could use it if we wanted
mTurn1.Power              = -30;
mTurn1.TachoLimit         = (max(tachoLength)-finalTarget)+10;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();

mTurn1                    = NXTMotor(global_info.portM(2)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false; % we could use it if we wanted
mTurn1.Power              = 30;
mTurn1.TachoLimit         = (max(tachoLength)-finalTarget)+10;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
pause(3)

%check distance to target.
OpenUltrasonic(global_info.portUS, 'snapshot');
ACTION_ScanDistance.m
When this action is sent to the robot it will first rotate 45 degrees to the right, then rotate 90 degrees to the left. While rotating to the left the robot will stop at certain intervals, scan in front of itself, and save the distance returned from the scan together with how far it has rotated at that scan. When it has finished rotating it will check whether it has detected any objects.
function [tachoLength,finalTarget] = scanDistance()
% portUS = SENSOR_4;
% PortM = [MOTOR_A,MOTOR_B];
% PortC = [MOTOR_C];
% portA = [SENSOR_1];
global global_info;
global h;
global tachoLength;
global finalTarget;
global distance;
global targetAcquired;
global edgeDetected;

global_info.tachoLength = zeros(500,1);
global_info.distance = zeros(500,1);
n = 8;            % bytes the US sensor received
count = 38;       % how many readings until end?
plotcols = 8;     % how many out of n echos to plot?
outOfRange = 160; % setting for out of range readings
targetAcquired = 0;
data = zeros(1, n);
allX = (1:count+1)';
edgeDetected = 0;

% Turn 45 degrees to right
pause(1)
mTurn1                    = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false; % we could use it if we wanted
mTurn1.Power              = -40;
mTurn1.TachoLimit         = 94;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();

mTurn1                    = NXTMotor(global_info.portM(2)); % ports swapped because it's nicer
mTurn1.SpeedRegulation    = false; % we could use it if we wanted
mTurn1.Power              = 40;
mTurn1.TachoLimit         = 94;
mTurn1.ActionAtTachoLimit = 'Brake';
mTurn1.SendToNXT();
pause(3)

%start sweep and search
for i = 1 : count
    mTurn1                    = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
    mTurn1.SpeedRegulation    = false; % we could use it if we wanted
    mTurn1.Power              = 20;
    mTurn1.TachoLimit         = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();
    mTurn1                    = NXTMotor(global_info.portM(2)); % ports swapped because it's nicer
    mTurn1.SpeedRegulation    = false; % we could use it if we wanted
    mTurn1.Power              = -20;
    mTurn1.TachoLimit         = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();
%determine closest target
global target
target = min(distance(1:i));
%remove other measurements (above 100cm)
for j=1:i;
    if distance(j) > target + 5 || distance(j) < target - 5
        distance(j) = 0;
    elseif distance(j) == 100
        distance(j) = 0;
    end
end
v = 1;
targetArea = zeros(1);
tachoArea = zeros(1);
%find area of block
for u=1:i;
    if distance(u) > 0;
        temp = zeros(length(targetArea),1);
        temp2 = temp;
        temp = targetArea;
        temp2 = tachoArea;
        targetArea = zeros(length(temp)+1);
        targetArea = temp;
        tachoArea = zeros(length(temp2)+1);
        tachoArea = temp2;
        clear temp;
        clear temp2;
        targetArea(v) = distance(u);
        tachoArea(v) = tachoLength(u);
        v = v + 1;
    end
end

%Confirm target detection and plot destination
finalTarget = tachoArea(int8(length(tachoArea)/2))
if distance > 100
    global_info.targetAcquired = 0
elseif distance < 100
    global_info.targetAcquired = 1
    plot(finalTarget/5,distance(finalTarget/5),'-*r');
end

% check if target is at edge of scan-area.
if distance(1) > 0 || distance(2) > 0 || distance(37) > 0 || distance(38) > 0
    global_info.edgeDetected = 1
else
    global_info.edgeDetected = 0
end
ACTION_ScanDistance2.m
This action is used if the ACTION_ScanDistance.m does not detect any objects. The robot will then rotate another 90 degrees while scanning, saving the distances and how far it has rotated. This action will continue to run until the robot detects an object, or until the robot has rotated 360 degrees.
function [tachoLength,finalTarget] = scanDistance()
% portUS = SENSOR_4;
% PortM = [MOTOR_A,MOTOR_B];
% PortC = [MOTOR_C];
% portA = [SENSOR_1];
global global_info;
global h;
global tachoLength;
global finalTarget;
global distance;
global targetAcquired;
grid on
global_info.tachoLength = zeros(500,1);
global_info.distance = zeros(500,1);
n = 8;            % bytes the US sensor received
count = 38;       % how many readings until end?
plotcols = 8;     % how many out of n echos to plot?
outOfRange = 160; % setting for out of range readings
targetAcquired = 0;
data = zeros(1, n);
allX = (1:count+1)';
%start sweep and search
for i = 1 : count
    mTurn1                    = NXTMotor(global_info.portM(1)); % ports swapped because it's nicer
    mTurn1.SpeedRegulation    = false; % we could use it if we wanted
    mTurn1.Power              = 20;
    mTurn1.TachoLimit         = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();

    mTurn1                    = NXTMotor(global_info.portM(2)); % ports swapped because it's nicer
    mTurn1.SpeedRegulation    = false; % we could use it if we wanted
    mTurn1.Power              = -20;
    mTurn1.TachoLimit         = 5;
    mTurn1.ActionAtTachoLimit = 'Brake';
    mTurn1.SendToNXT();
    pause(0.8)
%determine closest target
global target
target = min(distance(1:i));
%remove other measurements (above 100cm)
for j=1:i;
    if distance(j) > target + 5 || distance(j) < target - 5
        distance(j) = 0;
    elseif distance(j) == 100
        distance(j) = 0;
    end
end
v = 1;
targetArea = zeros(1);
tachoArea = zeros(1);
%find area of block
for u=1:i;
    if distance(u) > 0;
        temp = zeros(length(targetArea),1);
        temp2 = temp;
        temp = targetArea;
        temp2 = tachoArea;
        targetArea = zeros(length(temp)+1);
        targetArea = temp;
        tachoArea = zeros(length(temp2)+1);
        tachoArea = temp2;
        clear temp;
        clear temp2;
        targetArea(v) = distance(u);
        tachoArea(v) = tachoLength(u);
        v = v + 1;
    end
end

% Confirm target detection and plot destination
finalTarget = tachoArea(int8(length(tachoArea)/2));
if distance > 100
    global_info.targetAcquired = 0;
elseif distance < 100
    global_info.targetAcquired = 1;
    plot(finalTarget/5,distance(finalTarget/5),'-*r');
end
NXT_Clawbot.m
global global_info;
global_info.REAL_TIME = 1; % This is a Real-Time run
global_info.STOP_AT = current_clock(3) + [0 20 0]; % stop after 20 mins
global h;

h = initNXT();
png = petrinetgraph({'NXT_test_def'});
dynamicpart.initial_markings = {'p1',1,'p2',0,'p3',0,'p4',0,...
    'p5',0,'p6',0,'p7',0,'p8',0,'p9',0,'p10',0,'p11',0,'p12',0,...
    'p13',0,'p14',0,'p15',0,'p16',0,'p17',0,'p18',0,'p19',0,...
    'p20',0,'p21',0,'p22',0,'p23',0};
dyn.initial_priority = {'tDone',2,'tScan',1,'tCleanup3',3,...
    'tCleanup2',2,'tCleanup1',1,'tFalseTar',2,...
    'tClose',1,'tEdge',2,'tDToTarg',1}; % initial priority
dynamicpart.firing_times = {'tStart',2,'tScan1',2,'tToTarg',2,...
    'tDToTarg',2,'tClose',1,'tRot180',2,...
    'tBorder',2,'tOpen',2,'tReverse',2,...
    'tRot1802',1,'tBuffer',1,'tTarget',1,...
    'tNoTarget',1,'tScan2',1,'tTarget2',1,...
    'tNoTarget2',1,'t360Scanned',1,'tForward',1,...
    'tForward2',1,...
    'tDone',1,'tFalseTarg',1,'tEdge',1,'tToTarg2',1,'tScan3',1};
display('Push button to start clean-up procedure!');
% dynamicpart.initial_priority = {'tPEDES_START',5, 'tPEDES_CYCLE',5,...
%     'tEM_START',5, 'tEM_CYCLE',5};
%disp('System is ready ...');
sim = gpensim(png, dynamicpart);
% close_TL_NXT(); % Never forget to clean up after your work!!!
% print_statespace(Sim_Results);
% plotp(Sim_Results, {'p1','p2','p3'});
NXT_Clawbot_def.m
function [png] = NXT_test_def()
png.PN_name = 'PDF for nxt test';
png.set_of_Ps = {'p1','p2','p3','p4','p5','p6','p7','p8',...
    'p9','p10','p11','p12','p13','p14','p15','p16','p17',...
    'p18','p19','p20','p21','p22','p23'};
png.set_of_Ts = {'tStart','tScan1','tToTarg','tDToTarg',...
    'tClose','tRot180','tBorder','tOpen','tReverse',...
    'tRot1802','tForward','tBuffer','tTarget','tNoTarget',...
    'tScan2','tTarget2','tNoTarget2','t360Scanned','tForward2',...
    'tDone','tCleanUp1','tCleanUp2','tCleanUp3','tFalseTarg',...
    'tRot1803','tForward3','tEdge','tToTarg2','tScan3'};
    'tBorder',1,...
    'tBorder','p9',1,'p9','tOpen',1,'tOpen','p10',1,'p10','tReverse',1,'tReverse','p11',1,...
    'p11','tRot1802',1,'tRot1802','p12',1,'p12','tForward',1,'tForward','p13',1,'p13','tBuffer',1,...
    'tBuffer','p2',1,...
    'p3','tNoTarget',1,'tNoTarget','p14',1,'p14','tScan2',1,'tScan2','p15',1,'p15','tTarget2',1,...
    'tTarget2','p4',1,'p15','tNoTarget2',1,'tNoTarget2','p14',1,'tNoTarget2','p16',1,'p16','t360Scanned',3,...
    'p14','t360Scanned',1,'t360Scanned','p17',1,'p17','tForward2',1,'tForward2','p2',1,...
    'tBuffer','p18',1,'p18','tDone',3,'tDone','p19',1,'p2','tDone',1,...
    'p16','tCleanUp1',1,'p16','tCleanUp2',2,'p16','tCleanUp3',3,...
    'p6','tFalseTarg',1,'tFalseTarg','p20',1,'p20','tRot1803',1,'tRot1803','p21',1,...
    'p21','tForward3',1,'tForward3','p2',1,...
    'p5','tEdge',1,'tEdge','p22',1,'p22','tScan3',1,'tScan3','p23',1,...
    'p23','tToTarg2',1,'tToTarg2','p5',1};
t360Scanned_pre.m
function [fire, transition] = t360Scanned_pre(transition)
display('WE ARE IN T360SCANNED_POST');
b15 = get_place('p16');
b13 = get_place('p14');
fire = (b15.tokens == 3) && b13.tokens;
t360Scanned_post.m
tBorder_pre.m
tBorder_post.m
tBuffer_pre.m
tBuffer_post.m
tCleanUp1_pre.m
tCleanUp2_pre.m
fire = b3.tokens;
tClose_pre.m
function [fire, transition] = tClose_pre(transition)
b5 = get_place('p6');
global global_info;
if (b5.tokens && global_info.falseTarget == 0)
    fire = 1;
else
    fire = 0;
end
tClose_post.m
function [] = tClose_post(transition)
pause(5);
disp('Target aquired!! These ARE the droids we are looking for');
ACTION_CloseClaw;
end
tDToTarget_pre.m
function [fire, transition] = tDToTarg_pre(transition)
global global_info;
b4 = get_place('p5');
if global_info.edgeDetected == 0 && b4.tokens == 1;
    fire = 1;
end
tDToTarget_post.m
tEdge_pre.m
function [fire, transition] = tEdge_pre(transition)
global global_info;
b5 = get_place('p5');
if b5.tokens && global_info.edgeDetected == 1
    fire = 1;
else
    fire = 0;
end
tEdge_post.m
tFalseTarg_pre.m
function [fire, transition] = tFalseTarg_pre(transition)
global global_info;
b5 = get_place('p6');
if (b5.tokens && global_info.falseTarget == 1)
    fire = 1;
else
    fire = 0;
end
tFalseTarg_post.m
tForward_pre.m
function [fire, transition] = tForward_pre(transition)
global global_info;
b11 = get_place('p12');
fire = b11.tokens;
tForward_post.m
tForward2_pre.m
function [fire, transition] = tForward2_pre(transition)
global global_info;
b16 = get_place('p17');
fire = b16.tokens;
tForward2_post.m
tForward3_pre.m
function [fire, transition] = tForward3_pre(transition)
global global_info;
% global targetAcquired;
b20 = get_place('p21');
fire = b20.tokens;
tForward3_post.m
tNoTarget_pre.m
global global_info;
% global targetAcquired;
b2 = get_place('p3');
if ((b2.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end
tNoTarget_post.m
tNoTarget2_pre.m
function [fire, transition] = tNoTarget2_pre(transition)
global global_info;
% global targetAcquired;
b14 = get_place('p15');
if ((b14.tokens == 1) && (global_info.targetAcquired == 0))
    fire = 1;
else
    fire = 0;
end
tNoTarget2_post.m
tOpen_pre.m
tOpen_post.m
tReverse_pre.m
tReverse_post.m
tRot180_pre.m
tRot180_post.m
tRot1802_pre.m
tRot1802_post.m
tRot1803_pre.m
function [fire, transition] = tRot1803_pre(transition)
global global_info;
% global targetAcquired;
b19 = get_place('p20');
fire = b19.tokens;
tRot1803_post.m
tScan1_pre.m
tScan1_post.m
tScan2_pre.m
function [fire, transition] = tScan2_pre(transition)
b13 = get_place('p14');
b15 = get_place('p16');
fire = b13.tokens & b15.tokens < 3;
tScan2_post.m
tScan3_pre.m
tScan3_post.m
tStart_pre.m
function [fire, transition] = tStart_pre(transition)
global global_info;
OpenSwitch(global_info.portB);
button = GetSwitch(global_info.portB);
CloseSensor(global_info.portB);
if button == 1
    fire = 1;
else
    fire = 0;
end
tStart_post.m
tTarget_pre.m
tTarget_post.m
tTarget2_pre.m
function [fire, transition] = tTarget2_pre(transition)
global global_info;
% global targetAcquired;
b14 = get_place('p15');
tTarget2_post.m
tToTarg_pre.m
tToTarg_post.m
tToTarg2_pre.m
tToTarg2_post.m