Direct Parameter Learning on BOSCH Motors



Chengzhi Wu


Thesis topic available

Possible start:

immediately


In the research project "AgiProbot" at the Karlsruhe Institute of Technology (KIT), a dedicated Blender addon has been developed, in addition to the collection of real-world data, to generate synthetic data of various motor instances. With it, large datasets, comprising both images and point clouds, can be created for a range of computer vision tasks. A particular challenge among these tasks is learning explicit parameters directly from the input, e.g. bolt positions or gear size. This is harder than standard classification, detection, or segmentation tasks, since it requires the network to learn disentangled feature representations internally.
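To make the parameter learning task concrete, the sketch below shows a minimal PointNet-style forward pass in plain NumPy: a shared per-point MLP followed by order-invariant max pooling, with a regression head that outputs continuous parameter values rather than class logits. The architecture, dimensions, and parameter count are illustrative assumptions for this posting, not the project's actual network.

```python
import numpy as np

# Assumed illustrative architecture (NOT the project's actual network):
# a PointNet-style encoder maps an unordered point cloud to one global
# feature vector; a linear head then regresses explicit motor parameters
# (e.g. gear radius, bolt count) instead of predicting class labels.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class PointParamRegressor:
    def __init__(self, in_dim=3, feat_dim=64, n_params=4):
        # shared per-point MLP (one hidden layer, for brevity)
        self.w1 = rng.normal(0, 0.1, (in_dim, feat_dim))
        self.b1 = np.zeros(feat_dim)
        # regression head on the pooled global feature
        self.w2 = rng.normal(0, 0.1, (feat_dim, n_params))
        self.b2 = np.zeros(n_params)

    def forward(self, points):
        # points: (N, 3) unordered point cloud
        per_point = relu(points @ self.w1 + self.b1)  # (N, feat_dim)
        global_feat = per_point.max(axis=0)           # permutation-invariant pooling
        return global_feat @ self.w2 + self.b2        # (n_params,) continuous outputs

model = PointParamRegressor()
cloud = rng.normal(size=(1024, 3))  # stand-in for a sampled motor surface
params = model.forward(cloud)
# max pooling makes the prediction invariant to the point ordering
shuffled = model.forward(rng.permutation(cloud, axis=0))
```

The max pooling step is what makes the prediction independent of the order of the input points, which is essential for point cloud inputs.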


In this work, a large synthetic dataset will be generated with the provided Blender addon. The explicit parameters of each motor will be saved, together with the ground-truth semantic labels of the generated motor images/point clouds. With this synthetic dataset, different multi-modal neural network architectures can be investigated for the parameter learning task, and architecture modifications may be proposed to improve performance. The dataset will be published as an open benchmark for parameter learning. Finally, with an optional transfer learning step, the trained network may be evaluated on real-world data.
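A per-sample ground-truth record for such a dataset could look like the sketch below: explicit parameter values stored alongside a pointer to the semantic label map. All field names (`gear_radius_mm`, `num_bolts`, `bolt_positions`) are hypothetical placeholders, not the addon's real export schema.

```python
import json
import os
import tempfile

# Hedged sketch of a ground-truth record exported next to each rendered
# motor instance; the field names below are illustrative assumptions,
# not the Blender addon's actual schema.

def save_sample_record(out_dir, sample_id, params, label_path):
    """Write the explicit motor parameters and a pointer to the
    semantic label file for one synthetic sample."""
    record = {
        "sample_id": sample_id,
        "parameters": params,           # explicit parameter ground truth
        "semantic_labels": label_path,  # per-pixel / per-point label file
    }
    path = os.path.join(out_dir, f"{sample_id}.json")
    with open(path, "w") as f:
        json.dump(record, f, indent=2)
    return path

out_dir = tempfile.mkdtemp()
p = save_sample_record(
    out_dir,
    "motor_000001",
    {
        "gear_radius_mm": 12.5,
        "num_bolts": 4,
        "bolt_positions": [[10.0, 0.0], [0.0, 10.0], [-10.0, 0.0], [0.0, -10.0]],
    },
    "labels/motor_000001.png",
)
with open(p) as f:
    loaded = json.load(f)
```

Keeping parameters and semantic labels in one record per sample makes it straightforward to train both the regression head and a segmentation branch from the same dataset.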


We Expect

  • Field of study: computer science, mathematics, electrical engineering, or applied physics, with good programming skills
  • Willingness to familiarize yourself with new topics and to contribute your own ideas
  • Good spoken and written English, the ability to work independently, and strong analytical skills
  • A good understanding of the basics of deep learning; experience with deep learning projects is a plus

We Offer

  • Intensive support and a pleasant working atmosphere in a creative team of motivated scientists
  • The possibility of a subsequent position as a research assistant to further deepen the acquired knowledge
  • Development of joint publications
