日本語 | English
Last updated: 2022/12/29
This repository contains information for participants of the Automated AI Challenge 2022 (Simulation), including how to build the development environment, the competition rules, and other information.
Unlike the 3rd Automated Driving AI Challenge held in 2021, this competition uses the autonomous driving software Autoware.universe and the autonomous driving simulator AWSIM. Please follow the steps below to set up your environment and participate in the competition.
See RULE_en.md for a detailed explanation of the competition rules.
This competition is divided into a "Challenge Course" for beginners and an "Advanced Course" for experts. Participants may try both courses and are asked to make a final course selection based on their own skill level.
The online scoring environment allows submissions to both the Advanced and Challenge Courses, but you will need to delete your previous submitted scores when switching courses.
We recommend the following system requirements for this competition.
OS: Ubuntu 20.04
CPU: Intel Core i7 (8 cores) or higher
GPU: NVIDIA GeForce RTX 3080 (VRAM 12 GB) or higher
Memory: 32 GB or more
Storage: SSD with 30 GB or more of free space
If you cannot prepare a PC that meets the above specifications, please refer to the "For participants with two PCs" specifications below.
Autoware PC:
OS: Ubuntu 20.04
CPU: Intel Core i7 (8 cores) or higher
GPU: NVIDIA GeForce GTX 1080 or higher
Memory: 16 GB or more
Storage: SSD with 10 GB or more of free space
For more information, click here.
AWSIM PC:
OS: Ubuntu 20.04 or Windows 10
CPU: Intel Core i7 (6 cores, 12 threads) or higher
GPU: NVIDIA GeForce RTX 2080 Ti or higher
For more information, click here.
※The two PCs should be on the same network; if they are, ROS2 topic communication works without any additional settings. In the unlikely event that topic communication does not work, deactivate the firewall or review its rules.
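When topics do not flow between the two PCs, one common cause is a mismatched ROS 2 domain ID. The snippet below is only an illustrative sanity check (the topic name `/chatter` is a placeholder, not part of the competition setup):

```shell
# ROS2 nodes discover each other across machines only when both use the
# same domain ID (the default is 0). Set it explicitly on both PCs:
export ROS_DOMAIN_ID=0
echo "ROS_DOMAIN_ID=$ROS_DOMAIN_ID"
# Then verify connectivity with a throwaway topic:
#   PC 1: ros2 topic pub /chatter std_msgs/msg/String "data: ping"
#   PC 2: ros2 topic echo /chatter
```

If the echo on PC 2 prints the message, DDS discovery between the machines is working.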
- Add the repository.
sudo add-apt-repository ppa:graphics-drivers/ppa
- Update the package list.
sudo apt update
- Install the driver using ubuntu-drivers.
sudo ubuntu-drivers autoinstall
- Restart your system and verify the successful installation of the driver.
nvidia-smi
- Update the package list.
sudo apt update
- Install libvulkan1.
sudo apt install libvulkan1
For Ubuntu:
1. Download and unzip the executable of the course for the competition.
・Tutorial: click here
2. Grant execute permission to "aichallenge_tutorial_ubuntu.x86_64".
3. Double-click the file to start AWSIM.
4. You will see the AWSIM window:
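Granting execute permission in step 2 is presumably done with `chmod`; the exact command is not shown above, so the following is an assumption (run it in the directory where you unzipped the archive):

```shell
# Presumed command for step 2; the guard only skips the chmod
# when the file has not been downloaded yet.
if [ -f aichallenge_tutorial_ubuntu.x86_64 ]; then
  chmod +x aichallenge_tutorial_ubuntu.x86_64
fi
```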
For Windows:
1. Download and unzip the executable of the course for the competition.
・Tutorial: click here
2. Double-click the file to start AWSIM.
3. You will see the AWSIM window:
We recommend using the Docker image of Autoware (with CUDA) for the competition.
Please install the following:
- docker
- rocker
  - A tool for running RViz, rqt, and other GUI applications in Docker containers.
- Nvidia Container Toolkit
- git lfs
- ROS2 (confirmed to work with Galactic)
The Docker image is [autoware (fc50327ec926d5c9a04d385581f102a418af0403)](https://github.com/autowarefoundation/autoware/commit/fc50327ec926d5c9a04d385581f102a418af0403) with the following changes applied.
- fix(pointcloud_preprocessor): add missed target dependency #2101
- The tier4_*_launch packages have been removed.
  - They were moved to aichallenge_submit; if you want to change Autoware's launch or config files, edit them there.
- Pull the Docker image using docker pull.
docker pull ghcr.io/automotiveaichallenge/aichallenge2022-sim/autoware-universe-cuda:3.1
※If the above method takes a long time or times out, download the tar file of the image from here and load it with the following command.
docker load < aichallenge2022_sim_autoware_v3.1.tar.gz
- Get the data for the competition.
sudo apt install -y git-lfs
git lfs clone https://github.com/AutomotiveAIChallenge/aichallenge2022-sim
- Start rocker.
cd ./aichallenge2022-sim
rocker --nvidia --x11 --user --net host --privileged --volume autoware:/aichallenge -- ghcr.io/automotiveaichallenge/aichallenge2022-sim/autoware-universe-cuda:3.1
As sample code to use as a base, this repository provides the following ROS2 packages in autoware/aichallenge_ws/src.
- aichallenge_launch
  - Contains the main launch file `aichallenge.launch.xml`. All ROS2 nodes are launched from this launch file.
- aichallenge_eval
- Package for score calculation.
- aichallenge_score_msgs
- Contains message definitions.
- aichallenge_submit
- The contents of this directory may be freely modified.
- All ROS2 packages implemented by participants should be placed in this directory, since only its contents will be submitted. The following packages are included at the time of distribution:
- aichallenge_submit_launch
  - Since `aichallenge_submit_launch.launch.xml` is called from the main launch file `aichallenge.launch.xml`, modify this launch file so that the ROS2 nodes you implement are launched.
- sample_code_cpp
- This is a sample autonomous driving implementation.
- obstacle_stop_planner_custom
- Fixes a false obstacle detection problem in obstacle_stop_planner from autoware.universe.
- tier4_*_launch
- This is a partially edited copy of Autoware's launch files. Autoware's own tier4_*_launch packages have been deleted, so be sure to keep this copy in aichallenge_submit.
- It has been modified to call obstacle_stop_planner_custom instead of obstacle_stop_planner.
# In the Rocker container
cd /aichallenge/aichallenge_ws
rosdep update
rosdep install -y -r -i --from-paths src --ignore-src --rosdistro $ROS_DISTRO
colcon build
Place the ROS2 packages you have created under aichallenge_ws/src/aichallenge_submit so that they can be built using the above procedure.
# In the Rocker container
source /aichallenge/aichallenge_ws/install/setup.bash
ros2 launch aichallenge_launch aichallenge.launch.xml
At this point, the setup and execution on the Autoware side is complete. If the setup was successful, rviz will display a point cloud map.
This section describes how to check the operation using Autoware and AWSIM.
- Start AWSIM.
- Start Autoware.
cd /aichallenge
ros2 launch autoware_launch e2e_simulator.launch.xml vehicle_model:=sample_vehicle sensor_model:=awsim_sensor_kit map_path:=nishishinjuku_autoware_map
- You will see the Rviz2 window:
※For how to use Autoware, refer to the official documentation
- Click "Panels" -> "Add New Panel" in the Rviz2 menu and add AutowareStatePanel.
- You can see that self-location estimation is working.
- Note that in some cases you may have to select "2D Pose Estimate" in the toolbar and drag at the actual position of the vehicle.
- Select "2D Goal Pose" in the toolbar and specify the goal position by dragging.
- You can see that the route is displayed and the status becomes "WAITING FOR ENGAGE" as shown below (this can take several minutes):
- Press the Engage button, and you can see that automated driving starts.
Please refer to RULE_en.md for the time acquisition method.
To calculate the score, only the `aichallenge_submit` package is submitted from the web page of the online evaluation environment for automatic scoring.
After submission, the online evaluation environment uses the scripts under `evaluation/` to perform the following steps.
The uploaded `aichallenge_submit.tar.gz` is placed under `evaluation/`.
`evaluation/build.sh` is executed to create the Docker image defined in `evaluation/Dockerfile`. The procedure for creating this image is as follows:
- extract the submitted `aichallenge_submit.tar.gz` to `/aichallenge/aichallenge_ws/src/aichallenge_submit`
- run `rosdep install` and `colcon build`
The simulator is then launched in the online evaluation environment and the simulation is started.
In the container, `evaluation/main.bash` is executed, which performs the following:
- starts the ROS2 nodes
- starts the scenario
When executed via `evaluation/run.sh`, the results (score.json) are saved under `evaluation/output`.
Compress the source code in `aichallenge_submit`:
cd evaluation
sh create_submit_tar.sh
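For reference, `create_submit_tar.sh` presumably just archives the `aichallenge_submit` sources; an equivalent manual command might look like the following (the source path is an assumption, so prefer the provided script):

```shell
# Archive only the aichallenge_submit directory so that extracting the
# tarball yields aichallenge_submit/ at the top level.
# SRC is an assumed relative path from evaluation/ to the workspace sources.
SRC=${SRC:-../autoware/aichallenge_ws/src}
if [ -d "$SRC/aichallenge_submit" ]; then
  tar -czf aichallenge_submit.tar.gz -C "$SRC" aichallenge_submit
fi
```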
Make sure that the compressed file `evaluation/aichallenge_submit.tar.gz` has been generated.
Before uploading to the online evaluation environment, please confirm that you can build and execute in a Docker container similar to the online environment using your local environment by following the steps below.
First, make sure that `aichallenge_submit.tar.gz` is located under `evaluation/`.
Next, build the Docker image containing the `aichallenge_submit` you created.
sh build.sh
Once the build is complete, launch the Docker container with `run.sh` and execute the scoring flow.
sh run.sh
Finally, check the scores output to `evaluation/output/score.json`.
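The exact fields inside score.json are not documented here, so the simplest check is to pretty-print whatever was produced:

```shell
# Pretty-print the result file written by the scoring flow; the guard
# skips the step if the scoring run produced no output.
if [ -f evaluation/output/score.json ]; then
  python3 -m json.tool evaluation/output/score.json
fi
```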
Upload the `aichallenge_submit.tar.gz` created in (1) according to the instructions shown after logging in to the web page.
After the upload is finished, the source build and simulation will be executed in order.
- If it completes successfully, the message `Scoring complete` is displayed, along with the time for each of the distribution and evaluation scenarios. The time of the last uploaded evaluation scenario is used as the final time in the ranking.
- Even if the scenario finishes successfully, `No result` is displayed if no score is output (e.g., due to a launch failure), and `Checkpoint not passed` is displayed if not all checkpoints were passed; in either case, the time is not used as the final time.
- If the build fails, `Build error` is displayed. Please reconfirm that you can build the Docker image by following steps (1) and (2).
- If the simulator fails to run, `Simulator error` is displayed. In this case, there may be an internal error on the server side, so please upload again. If the error message appears repeatedly, please contact us.
- The grading process is performed 5 times per submission, and the result is determined by the average of the 5 runs.
Please note that you will not be able to upload new sources while the grading process is in progress. Uploading is limited to 3 times a day and will be reset at midnight Japan time.
When there are updates on GitHub, we will post a new comment on the issue at the following URL. Please subscribe to that issue (turn on notifications) to be notified of updates: #1.
If you have questions about the competition or repository contents, please submit an issue on GitHub. You can ask questions in either English or Japanese.
Questions must be directly related to the competition. We will not be able to answer questions regarding the use of the software.
Please close an issue when resolved.
We generally reply to questions within two business days. Please note that depending on the contents of your questions, it may take longer than two business days to answer.
For inquiries regarding your online simulator account (for example, if you cannot log in to the online simulator), please contact us at:
Email: info-ai@jsae.or.jp