Taking a leaf from recent advances in LLMs, it is important to have a pretext task for building a base model that can be trained unsupervised, or with minimal supervision, on large datasets. On top of this base model, task-specific DMS models can then be trained on significantly smaller datasets with high-quality labels. The typical unsupervised approach is to train an autoencoder neural network (or a similar architecture). While this works well and yields better performance than omitting the pretraining step, in this work we propose an improved unsupervised training algorithm for the base model neural network used in DMS tasks. In addition to showing improved performance over both the vanilla autoencoder approach and the "train from scratch" approach, we also show that the new method results in more interpretable models.
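For illustration, the sketch below outlines the baseline two-stage pipeline that the abstract contrasts against: unsupervised autoencoder pretraining of a base encoder on a large unlabeled dataset, followed by supervised fine-tuning of a small task head on a limited labeled dataset. It is a minimal PyTorch-style sketch; the network architecture, dimensions, class counts, and function names are illustrative assumptions, not the authors' implementation or their proposed algorithm.

```python
# Minimal sketch of the baseline two-stage pipeline (hypothetical architecture
# and dimensions; not the paper's actual method).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Base model: maps a raw DMS signal window to a latent embedding."""
    def __init__(self, in_dim=256, latent_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstruction head used only for the unsupervised pretext task."""
    def __init__(self, latent_dim=32, out_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, out_dim),
        )
    def forward(self, z):
        return self.net(z)

def pretrain_autoencoder(encoder, decoder, unlabeled_loader, epochs=10, lr=1e-3):
    """Stage 1: unsupervised pretraining on a large unlabeled dataset."""
    params = list(encoder.parameters()) + list(decoder.parameters())
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x in unlabeled_loader:          # x: (batch, in_dim), no labels
            recon = decoder(encoder(x))
            loss = loss_fn(recon, x)        # reconstruction pretext objective
            opt.zero_grad()
            loss.backward()
            opt.step()

def finetune_task_head(encoder, labeled_loader, num_classes=3, epochs=10, lr=1e-4):
    """Stage 2: supervised fine-tuning on a small, high-quality labeled dataset."""
    head = nn.Linear(32, num_classes)       # e.g. drowsiness/distraction classes (illustrative)
    params = list(encoder.parameters()) + list(head.parameters())
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in labeled_loader:         # small labeled dataset
            logits = head(encoder(x))
            loss = loss_fn(logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return head
```

The proposed method in the abstract replaces the Stage 1 autoencoder objective with a different unsupervised training algorithm; the fine-tuning stage on small labeled datasets remains the same in spirit.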