We will provide each team with a middleware protocol for controlling the car – in a Unity simulation, on a toy car, and even on a real vehicle.
We also provide a driving simulator as a Windows executable, which you can connect to from within your application via our interface. The same applies to the Jetson car, where the API provides access to vehicle control and the camera.
The interface comes as a Python module for working directly on the vehicle's system in the autonomous challenge, and as a C# wrapper for the Unity engine, so you can write a remote-control app for Android devices. You can also write an Android app without Unity by using the Java library.
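Since the concrete API is not documented here, the following is only an illustrative sketch of what sending a drive command over such a middleware interface might look like. The class, method, and message names (`CarClient`, `drive`, the JSON fields) and the [-1, 1] command range are invented for this example; consult the actual module for the real interface.

```python
import json

class CarClient:
    """Hypothetical middleware client sketch -- names and message
    format are illustrative, not the actual challenge API."""

    def __init__(self, transport):
        # transport is anything with a send(bytes) method,
        # e.g. a TCP socket wrapper to the simulator or car
        self.transport = transport

    def drive(self, steering, throttle):
        # clamp both commands to an assumed [-1, 1] range
        steering = max(-1.0, min(1.0, steering))
        throttle = max(-1.0, min(1.0, throttle))
        msg = {"cmd": "drive", "steering": steering, "throttle": throttle}
        self.transport.send(json.dumps(msg).encode("utf-8"))
        return msg

class FakeTransport:
    """In-memory stand-in for a real connection, for local testing."""
    def __init__(self):
        self.sent = []
    def send(self, data):
        self.sent.append(data)

client = CarClient(FakeTransport())
cmd = client.drive(steering=0.25, throttle=1.5)  # throttle gets clamped
```

Keeping the transport behind a minimal `send` interface like this lets the same control code target the simulator, the Jetson car, or a test double without changes.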
Nvidia Drive PX Hardware Platform
We will also supply the Nvidia Drive PX compute platform we use for self-driving development (on the Jetson toy car and the real car). It comes with camera input, an advanced video-analysis system, and an object-recognition system.
CAN-interface of a real car
For the CAN hack you'll get a real car to work on, with access to its CAN interface. We'll supply a basic hardware platform with open-source tools so you can sniff the bus while we are at the pitch / test venue.
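As a concrete starting point for bus sniffing: on Linux, a raw SocketCAN frame arrives as a fixed 16-byte structure, and the sketch below unpacks one using only the standard library. This assumes the Linux `struct can_frame` layout (32-bit ID, 8-bit length, 3 padding bytes, 8 data bytes); the open-source tools we supply may wrap this level for you.

```python
import struct

# Linux SocketCAN frame: 32-bit CAN ID, 8-bit DLC (data length),
# 3 padding bytes, then up to 8 data bytes -- 16 bytes total.
CAN_FRAME_FMT = "<IB3x8s"

def parse_can_frame(raw):
    """Return (can_id, payload) from a raw 16-byte SocketCAN frame."""
    can_id, dlc, data = struct.unpack(CAN_FRAME_FMT, raw)
    # mask off the EFF/RTR/ERR flag bits in the top of the ID field
    return can_id & 0x1FFFFFFF, data[:dlc]

# build a sample frame as it would arrive from a raw CAN socket
raw = struct.pack(CAN_FRAME_FMT, 0x244, 3, b"\x01\x02\x03")
can_id, payload = parse_can_frame(raw)
```

On the car itself you would read such frames from a socket bound to the CAN interface (or let a tool like candump from can-utils do it), then parse them the same way.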
Keywords and Links:
- µControllers, Arduino, Linux, C/C++, Python
- Data visualization, UI/UX
- Car control, Autonomous driving, Machine Learning
- Nvidia Drive PX2
- OpenCV getting started code, lane detection
- Nvidia end-to-end learning
- OpenCV TX2 onboard camera
- Intro Caffe with TX2
- Misc Jetson TX2
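The OpenCV lane-detection material linked above usually reduces to edge detection followed by a Hough transform. As a dependency-free illustration of the Hough voting step (a toy sketch on a synthetic edge set, not the OpenCV API):

```python
import math

def hough_peak(edges, n_theta=180):
    """Minimal Hough transform: edges is an iterable of (x, y) edge
    pixels. Each point votes for every line (rho, theta) passing
    through it; the bin with the most votes is the dominant line."""
    acc = {}
    for x, y in edges:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(rho, t)] = acc.get((rho, t), 0) + 1
    (rho, t), votes = max(acc.items(), key=lambda kv: kv[1])
    return rho, math.pi * t / n_theta, votes

# synthetic vertical "lane marker": edge pixels along x = 5
edges = [(5, y) for y in range(20)]
rho, theta, votes = hough_peak(edges)
# all 20 points agree on the line x = 5, i.e. |rho| = 5
```

OpenCV's `cv2.HoughLines` does the same voting over a real edge image (e.g. the output of `cv2.Canny`), just much faster.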