Coursework 2425
Read the Assessment Item 1 Coursework page on Blackboard! Make sure you read the assignment brief and CRG and understand them. Make sure you also understand the 3 different complexity levels. Focus on making a good solution for the simple level first! Ask the module team if anything is unclear!
The assignment is meant to be challenging and open-ended.
There is no single "correct" solution you need to find; there are possibly very many. You should be innovative and creative and devise algorithmic solutions that can solve the given problem. You will be rewarded for good ideas and a coherent, comprehensive concept and implementation. Remember: the actual robot performance (i.e., whether it pushes all objects where they should go) is only a small fraction of your mark. More important is that you submit working code, make clever use of the robot's sensing capabilities, and overall present a coherent and well-structured solution. You should focus on a well-thought-out solution and be confident in understanding and presenting it.
You are allowed (even encouraged) to build your solution on code fragments presented in the module, and also on code you find online, as long as you reference it in the comments in your source code. Source code in your solution that is copied from solutions available online or elsewhere without a reference is an academic plagiarism offence and may be prosecuted in line with the university policy. Simply: don't do it, reference it (the URL to the original version is enough for source code; don't worry about Harvard referencing). We also run automated code structure analysis across all submissions (past and present), which usually picks up collusion (i.e. two students submitting the same code with changed variable names, comments, etc.) quite reliably. Don't do it.
That said, you should not worry: if you do nothing wrong, you have nothing to fear. Give it your best and you'll be rewarded.
Here are a few quick reminders and commands you will likely use regularly:
- Using the Docker Image describes something you should be very familiar with by now. Your solution must work in the provided devcontainer, as that is where it will be tested.
- Make sure you have the latest version of the devcontainer with all features. To remind you how to get it:
docker pull lcas.lincoln.ac.uk/devcontainer/ros2-teaching:4
- Then you are ready to launch VSCode with your configured repository which you derived from the module's template repository during the week 1 workshop (watch the recording of it if you missed it!).
- Simulation:
- You will find that the new simulation environment has some coloured horizontal "markers" on both sides (here shown for the simple simulation environment without obstacles):
These markers indicate where the coloured objects should be pushed, but only for complexity level 3 (see Assessment Brief). If you don't want to implement this higher level of functionality, you must still ensure you don't confuse them with the boxes of the same colour!
For the coursework assignment, a number of different simulation worlds have been generated and released.
- For the simple environment (no obstacles), you launch the simulation with this command:
ros2 launch uol_tidybot tidybot.launch.py
The environment already has one single box in front of the robot to test the most basic functionality of your code.
- You should add a number of green cubes, placed randomly, before starting your behaviour:
ros2 run uol_tidybot generate_objects --ros-args -p n_objects:=10
This will generate 10 cubes randomly in the environment. You can change the number with the `n_objects` parameter.
- Remember, your job is to push all of them to any wall.
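Most solutions start from colour segmentation of the robot's camera images to locate the cubes. Here is a minimal sketch of the idea in plain NumPy; the `margin` threshold is an illustrative assumption, and in an actual node you would more likely convert to HSV and use `cv2.inRange` inside an `rclpy` image-subscriber callback:

```python
import numpy as np

def green_mask(rgb_image, margin=40):
    """Return a boolean mask of pixels that look clearly green.

    A pixel counts as green when its G channel dominates both R and B
    by at least `margin` (an illustrative value, not tuned for the
    simulator). This is a crude stand-in for a proper HSV threshold,
    but it shows the segmentation step.
    """
    r = rgb_image[..., 0].astype(np.int16)
    g = rgb_image[..., 1].astype(np.int16)
    b = rgb_image[..., 2].astype(np.int16)
    return (g - r >= margin) & (g - b >= margin)

def blob_centroid(mask):
    """Column/row centroid of the mask, or None if nothing was found."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

Steering so that the centroid's column moves towards the image centre is one simple way to line the robot up with a cube before pushing it.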
- This is an extension of Level 1, so what was said there still applies in general and your solution is expected to also work in the simulation of Level 1.
- There are 3 additional pre-defined environments for level 2, highlighting that your solution must be flexible enough to deal with different static obstacles (i.e. their position must not be assumed fixed in all setups; the walls, however, are always the same).
- Your solution should work on all of them, as it will be tested in one of them chosen at random.
- Here is how to launch the simulation environments for the three different worlds:
- `ros2 launch uol_tidybot tidybot.launch.py world:=level_2_1.world`. Note the `world` parameter defining the world you launch.
- `ros2 launch uol_tidybot tidybot.launch.py world:=level_2_2.world`
- `ros2 launch uol_tidybot tidybot.launch.py world:=level_2_3.world`
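For the obstacle worlds, a common starting point is to read the laser scanner and steer towards free space. Below is a pure-Python sketch of one such strategy (pick the sector of the scan with the largest minimum range, provided it exceeds a clearance threshold). The sector layout, the clearance value, and the plain-list input are illustrative assumptions; in a real node the ranges would come from a `sensor_msgs/LaserScan` subscription:

```python
def clearest_sector(ranges, n_sectors=3, clearance=0.5):
    """Split `ranges` into equal sectors and return the index of the
    sector with the largest minimum range, or None if every sector has
    something closer than `clearance` metres.

    With n_sectors=3 the indices can be read as 0=right, 1=ahead,
    2=left (assuming the scan sweeps right-to-left across the front).
    """
    size = len(ranges) // n_sectors
    best, best_min = None, clearance
    for i in range(n_sectors):
        sector_min = min(ranges[i * size:(i + 1) * size])
        if sector_min > best_min:
            best, best_min = i, sector_min
    return best
```

Returning `None` when everything is blocked lets the calling behaviour fall back to, for example, reversing and re-scanning.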
- This is an extension of Level 1 & 2. You will use the same simulation worlds as above, and your solution is expected to perform in any of them.
- But in addition you will make use of the markers at the top of the 2 walls. Instead of pushing the boxes randomly towards any wall, your robot should attempt to push red boxes towards the wall with the red marker, and green boxes towards the green one, respectively. (See Assessment Brief for further details.)
- Hence, here is how you populate the environment with boxes of red and green colours, respectively:
  - To put e.g. 10 green boxes, you can run `ros2 run uol_tidybot generate_objects --ros-args -p red:=false -p n_objects:=10`. As before, `n_objects` can be set to the number you like; the ROS parameter `red` is a boolean indicating if red boxes should be generated. Green is the default (`red` is `false`).
  - Consequently, this is the command to generate 10 red boxes: `ros2 run uol_tidybot generate_objects --ros-args -p red:=true -p n_objects:=10`.
  - There are more parameters than `n_objects` and `red` you could play with, but these 2 will do well in general. If you are really keen, you can find others documented in the source code of `spawn_objects.py` and play with them.
- Here is a view of an example of a full level 3 challenge with 10 red and 10 green boxes, with `rviz2` showing some sensor streams:
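The level 3 extension is essentially a sorting decision on top of levels 1 and 2: classify each box's colour, then pick the wall whose marker matches. A small sketch of that decision step; the channel-dominance classifier is a crude stand-in for proper HSV thresholding, and the marker coordinates are made-up example values (in your solution they would come from detecting the markers, or from the known world layout):

```python
# Hypothetical marker goal positions (x, y); example values only -- a
# real solution would detect the markers or use the known world layout.
MARKERS = {"green": (0.0, 2.0), "red": (0.0, -2.0)}

def classify_colour(r, g, b, margin=40):
    """Crudely classify an RGB sample as 'red', 'green', or None.

    A colour counts only when its channel dominates both others by at
    least `margin`, which also avoids matching grey walls and floor.
    """
    if r - g >= margin and r - b >= margin:
        return "red"
    if g - r >= margin and g - b >= margin:
        return "green"
    return None

def target_wall(colour, markers=MARKERS):
    """Goal position for a box of this colour, or None for unknown
    colours (fall back to the 'any wall' behaviour of levels 1 and 2)."""
    return markers.get(colour)
```

Keeping the colour-to-goal mapping in one place like this makes it easy to argue in your report why a misclassified box degrades gracefully to the lower-level behaviour.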
You have a lot of freedom when showing your work on the real robots. Generally, the same complexity levels apply, but you can vary the number of objects, walls and obstacles more freely. You are allowed to explore how your implementation deals with different challenges, and make sure that in your short video you highlight the performance in response to the respective environment. Using the real Limo Robot is your starting point.
Copyright by Lincoln Centre for Autonomous Systems