<p>Robotics has a very high technical barrier to entry even for something as basic as movement, which keeps many people at lower educational levels, or with limited technical backgrounds, from experimenting with the aspects that require less expertise.</p>
<p>The primary aspect we are making more accessible for experimentation is the scheduling and routing of multiple robots.</p>
<p>One of the challenges we are streamlining is collision avoidance: the robots carry various sensors that automatically stop them from colliding with each other or with large objects, or from making a dangerous turn off a ledge.</p>
<p>Doing this on your own would typically require knowledge of embedded sensors and hardware interrupts just to move a robot safely across a table.</p>
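<p>To make that concrete, the sketch below shows the kind of guard check the platform handles on the user's behalf; the sensor functions, threshold, and motor calls are hypothetical placeholders rather than our actual firmware interface.</p>
<pre><code># Minimal sketch of the collision and ledge guard the system runs for the user.
# The sensor functions, threshold, and motor calls are hypothetical placeholders.

PROXIMITY_STOP_CM = 15  # treat anything closer than this as an obstacle

def read_proximity_cm():
    """Placeholder for a range sensor reading the distance to the nearest obstacle."""
    return 42.0

def read_ledge_detected():
    """Placeholder for a downward-facing sensor that spots a drop-off."""
    return False

def path_is_clear():
    """True only when no obstacle is too close and no ledge is ahead."""
    return read_proximity_cm() > PROXIMITY_STOP_CM and not read_ledge_detected()

def safety_step():
    """One guard check run before each movement step."""
    if path_is_clear():
        print("driving forward")
        return True
    print("motors stopped")
    return False

if __name__ == "__main__":
    safety_step()
</code></pre>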
<p>Now users can experiment with our system by simply clicking where they want the robots to go on our video stream and watching them safely reach the selected point.</p>
<p>With some basic scripting, they can experiment with simple algorithms for making a robot follow a path or even trace certain patterns.</p>
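<p>As a rough illustration of what that scripting could look like, here is a minimal sketch; the RobotClient class and its go_to method are assumptions made for this example, not our published API.</p>
<pre><code># Illustrative sketch of scripting against a hypothetical client for the system.
# RobotClient and its methods are assumptions for this example, not a real API.

class RobotClient:
    """Stand-in for a connection to one robot through the Central Hub."""

    def __init__(self, robot_id):
        self.robot_id = robot_id
        self.x, self.y = 0.0, 0.0

    def go_to(self, x, y):
        """Plays the same role as clicking the point (x, y) on the video stream."""
        print(f"robot {self.robot_id}: moving to ({x}, {y})")
        self.x, self.y = x, y

def trace_square(robot, size):
    """Simple pattern: visit the four corners of a square."""
    for x, y in [(0, 0), (size, 0), (size, size), (0, size)]:
        robot.go_to(x, y)

if __name__ == "__main__":
    trace_square(RobotClient("r1"), size=50)
</code></pre>
<p>Because each go_to call does no more than a single click would, a few lines of script are enough to turn clicks into paths and patterns.</p>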
<p>For secondary education in technical fields, this opens a door into robotics that simply did not exist before.</p>
<p>Below are some examples of where this technology could be used.</p>
<ul class="competitors">
<li>
<h3>Terrain Coverage</h3>
<p>The primary example has been tracking terrain coverage. This could be useful in a house with cleaning robots, where the user wants to see which parts of the floor the robot has actually covered (see the sketch after this list).</p>
</li>
<li>
<h3>Interactive AI Testing</h3>
<p>This setup could also be used to test various robot AIs, with each robot relaying the information it senses back to the Central Hub, which can display it in an informative way.</p>
<p>Robot programs could then be debugged, with users clearly seeing what the robot is "thinking" as it moves about the environment.</p>
</li>
</ul>
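<p>For the terrain coverage example above, the sketch below shows one way position reports relayed to the Central Hub could be turned into a simple coverage map; the grid dimensions, cell size, and report format are illustrative assumptions.</p>
<pre><code># Sketch of a terrain-coverage map built from position reports relayed to the Central Hub.
# Grid dimensions, cell size, and the report format are illustrative assumptions.

CELL_SIZE = 10          # side length of one grid cell, in the same units as positions
GRID_W, GRID_H = 8, 6   # coverage grid dimensions in cells

def empty_coverage():
    """Start with nothing covered."""
    return [[False] * GRID_W for _ in range(GRID_H)]

def mark_covered(grid, x, y):
    """Mark the cell containing position (x, y) as covered."""
    col, row = int(x // CELL_SIZE), int(y // CELL_SIZE)
    if row in range(GRID_H) and col in range(GRID_W):
        grid[row][col] = True

def render(grid):
    """Print a simple text view: '#' for covered cells, '.' for uncovered ones."""
    for row in grid:
        print("".join("#" if cell else "." for cell in row))

if __name__ == "__main__":
    grid = empty_coverage()
    # Positions a cleaning robot might report back as it moves across the floor.
    for x, y in [(5, 5), (15, 5), (25, 5), (25, 15)]:
        mark_covered(grid, x, y)
    render(grid)
</code></pre>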