Transcript
This transcript was autogenerated. To make changes, submit a PR.
Good afternoon, everyone. I'm Raj. I'm excited
to be here today to talk about a fascinating and transformative
field in the automotive industry: advanced
driver assistance systems (ADAS) and autonomous driving (AD).
Let's dive into it right away. In this
presentation, we'll delve into the exciting world of ADAS and autonomous
driving. Think of ADAS as your car's superpower,
which enhances safety and makes driving a lot easier.
We'll explore the key components that power these
technologies, the impressive functions they perform,
and how we are steering towards fully autonomous vehicles.
Beyond the tech talk, it's all about making our roads
safer and transportation more efficient. So let's
navigate through the advancements that are shaping the future of
driving. I'll talk about the core components
and functions of ADAS, the computing needs of ADAS,
and finally touch upon the evolution towards autonomous
vehicles and the impact on road safety and transportation.
So what are ADAS and AD?
ADAS stands for advanced driver assistance systems.
It's a suite of technologies designed to enhance vehicle
safety and driving experience. From collision avoidance systems
to adaptive cruise control, ADAS represents a digital
copilot seamlessly integrating technology with driving.
And when it comes to autonomous driving, vehicles navigate
and operate without human input. This is
a shift from driver-assist features to fully autonomous vehicles,
revolutionizing the concept of transportation.
Now, let's take a look at the brains and eyes behind
the magic of ADAS. Imagine your car being equipped
with a superhero squad, each member playing a crucial
role. There are several members here: radars,
cameras, ultrasonic sensors, and lidars.
Let's see the function of each of these members.
Consider a long-range radar. It scans the road ahead,
detecting objects from a distance and helping the car
anticipate and react to potential dangers long before they become
an issue. Then there are short-range radars, which have a
keen sense of proximity. They focus on the immediate surroundings,
ensuring your car navigates through tight spaces,
handles parking, and keeps an eye on blind spots.
And cameras: they interpret traffic signs,
recognize lane markings, and provide a real-time visual feed,
offering the car a comprehensive view of the environment.
There are also infrared cameras, which act like stealthy night-vision experts.
These cameras see what the human eye can't in low-light conditions,
ensuring safety even at night when there is
little light. Then there are ultrasonic
sensors, which are like silent guardians.
They are responsible for detecting obstacles in
close quarters, making sure your car doesn't inadvertently
bump into anything during low-speed maneuvers.
Last but not least, lidar,
the precision mapper. Using laser beams,
lidar creates a detailed 3D map of the surroundings,
allowing the car to navigate complex environments and
understand the world with incredible accuracy.
Together, these components form the powerhouse of ADAS,
working seamlessly to make the driving experience safer and smarter.
It's like having a high-tech team ensuring
you're in good hands every time you hit the road.
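The sensor roles described above can be summarized in a small lookup structure. Note that the range figures below are rough illustrative assumptions for the sketch, not values from the talk:

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    typical_range_m: float  # assumed, illustrative figure only
    role: str

# The superhero squad from the talk; ranges are assumptions.
ADAS_SENSORS = [
    Sensor("long-range radar", 200.0, "detect objects far ahead"),
    Sensor("short-range radar", 30.0, "proximity, parking, blind spots"),
    Sensor("camera", 100.0, "traffic signs, lane markings, visual feed"),
    Sensor("infrared camera", 100.0, "low-light / night vision"),
    Sensor("ultrasonic sensor", 5.0, "close-quarters obstacle detection"),
    Sensor("lidar", 150.0, "detailed 3D map of the surroundings"),
]

def sensors_covering(distance_m: float) -> list[str]:
    """Return the sensors whose typical range covers a given distance."""
    return [s.name for s in ADAS_SENSORS if s.typical_range_m >= distance_m]
```

For a close-quarters parking maneuver, `sensors_covering(3.0)` would list every sensor including the ultrasonics, while at highway distances only the long-range radar and lidar remain in play.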
Here are a few examples of advanced driver assistance systems,
some ADAS features, basically. For example,
front collision warning alerts the driver if a potential
collision with the vehicle ahead is detected. And there
is automatic emergency braking, which automatically
applies the brakes if the driver doesn't respond in time.
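A minimal sketch of how that warning-then-braking escalation could be staged, using time-to-collision (TTC). The thresholds here are illustrative assumptions, not production calibration values:

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if the closing speed stays constant."""
    if closing_speed_mps <= 0:  # not closing in on the vehicle ahead
        return float("inf")
    return gap_m / closing_speed_mps

def fcw_aeb_decision(gap_m: float, closing_speed_mps: float,
                     warn_ttc_s: float = 2.5,  # assumed threshold
                     brake_ttc_s: float = 1.0  # assumed threshold
                     ) -> str:
    """Warn the driver first; brake automatically if time runs out."""
    ttc = time_to_collision(gap_m, closing_speed_mps)
    if ttc <= brake_ttc_s:
        return "brake"
    if ttc <= warn_ttc_s:
        return "warn"
    return "none"
```

With a 20 m gap closing at 10 m/s, TTC is 2 s, so the system warns; at an 8 m gap it brakes without waiting for the driver.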
Then, for example, there is lane departure
warning, which notifies the driver if the vehicle unintentionally
drifts out of its lane, and lane keeping
assist, which can gently steer the vehicle back into its
lane to prevent unintentional lane departure. So you see
a number of ADAS functions, which are divided into
a safety package, a comfort package, and a parking package.
So depending on which kinds of features the car makers
choose, you will have those functions in
mid-range, premium, or even entry-level vehicles. And maybe just to explain a couple
of functions more: there is, for example,
adaptive cruise control. This maintains the set speed but can automatically
adjust it based on the distance to the vehicles ahead.
It helps maintain a safe following distance in varying
traffic conditions. And there is blind spot detection, which alerts the
driver if there's a vehicle in the blind spot,
typically with a visual or audible warning. It enhances
awareness during lane changes.
Then there are park assist technologies and
automated parking assistants, which steer the vehicle
into a parking space, providing guidance to the driver. And there's
traffic sign recognition, which uses cameras to identify
and interpret traffic signs, displaying relevant information on the
dashboard. It helps drivers stay informed about speed
limits, stop signs, and other road signs.
And there is something like cross traffic alert, which warns
the driver of traffic approaching from the sides,
especially when backing out of parking spaces. It enhances safety
in situations where visibility may be limited.
These examples showcase the diversity of ADAS features,
contributing to safer and more convenient
driving experiences.
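Adaptive cruise control, as described above, holds a set speed but slows down to keep a safe following distance. A minimal proportional-control sketch of that idea, using an assumed constant time-gap policy and an assumed gain (not any vendor's algorithm):

```python
def acc_speed_command(set_speed_mps: float, own_speed_mps: float,
                      gap_m: float,
                      time_gap_s: float = 2.0,  # assumed following policy
                      k_gap: float = 0.3        # assumed proportional gain
                      ) -> float:
    """Hold the set speed, but back off when the gap ahead shrinks."""
    desired_gap_m = own_speed_mps * time_gap_s  # constant time-gap target
    gap_error_m = gap_m - desired_gap_m         # negative when too close
    # Never exceed the driver's set speed, never go below zero.
    return max(0.0, min(set_speed_mps, own_speed_mps + k_gap * gap_error_m))
```

At 25 m/s with a 100 m gap the controller simply commands the 30 m/s set speed; close the gap to 30 m and it commands a slower speed to restore the time gap.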
Coming to the compute power required for ADAS and
AD, I would like to discuss this compute in three dimensions.
The first is CPU, or scalar computing.
The second is GPU, for both
3D and parallel capability. And certainly the
most important for ADAS and AD is the
computing for machine learning and inferencing offload. So let's
have a look at CPU load first.
Today we see SoCs that reach something beyond 100
KDMIPS. There are x86 SoCs from
multiple vendors which offer
over 500 KDMIPS, and beyond that we
have server SoCs which can deliver as much as 5,000
KDMIPS. We can see quite a significant
dynamic range here. What's interesting is that when
we start looking at ADAS and AD, we see a higher level of
computational capacity for CPU, as much as 1,000
KDMIPS in the case of the robotaxi class of applications.
On the GPU front, let's see what
options we have silicon-wise. We see Arm iGPUs,
which are integrated GPUs, from about half to
five TFLOPS. We see discrete GPUs for PC,
mobile, and desktop ranging from 50 TFLOPS,
and then data center discrete GPUs providing
even higher TFLOPS and other computational capacity.
So what does it look like from an ADAS perspective? Even in ADAS
applications, we see a demand for GPU computing.
However, this isn't as much: GPU requirements range from
about 0.5 to one TFLOPS. When we go to machine learning,
we see a pretty wide range of capabilities.
There are SoCs from five to 200 INT8 TOPS,
depending on the vendor in question. Data centers offer
500 INT8 TOPS and beyond. This is what is available in
the market today. Obviously, the market is responding rapidly
and aggressively to the increase in the capability requirements for
ADAS and AD: we need as much as something
like 1,000 INT8 TOPS,
a very high level of dynamic range. We can also combine
two or more SoCs to get the required amount of TOPS, depending on the
application. So with such high-compute SoCs
at our disposal, and more and more automotive-grade SoCs
available in the industry, we are able to address the
needs of all autonomous vehicles, especially from
L3 to L5 autonomy.
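Since two or more SoCs can be combined to reach a required TOPS budget, the sizing is a trivial calculation. Using the ballpark figures mentioned in the talk (these are illustrative, not vendor specs):

```python
import math

def socs_needed(required_tops: float, per_soc_tops: float) -> int:
    """How many identical SoCs must be stacked to meet a compute target."""
    return math.ceil(required_tops / per_soc_tops)

# Robotaxi-class target of ~1,000 INT8 TOPS, served by SoCs
# offering ~200 INT8 TOPS each (figures from the talk's ranges).
count = socs_needed(1000, 200)
```

This ignores real integration overheads such as interconnect bandwidth and workload partitioning, which in practice push the count higher than the raw division suggests.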
Now let's see what kinds of chips are available, with
some examples. Here I have some examples of chips
which are used for ADAS and autonomous driving.
Cutting-edge chipsets like these play a
pivotal role in meeting the computational demands,
which is essential for real-time decision making and control.
As an example, Horizon Robotics
has an emphasis on edge computing;
their chips excel in processing sensor data directly on
chip, particularly for computer vision and object recognition.
Then there are NVIDIA
Orin and Thor. Their architecture stands out for its
powerful GPUs, enabling high-performance
parallel processing and simultaneous support for multiple
neural networks, which contributes to robust
perception capabilities in varied
driving scenarios. Then Texas Instruments,
with the TDA4 series, focuses
on sensor fusion,
efficiently integrating data from diverse sensors
to provide a comprehensive understanding of
the vehicle's environment. And then
Mobileye, for example, specializes in vision-based
solutions with its EyeQ chips, excelling in computer
vision algorithms for applications like lane departure
warning and collision avoidance. These chipsets
collectively address the intricate needs of ADAS and autonomous
driving. Their optimized designs support
tasks such as sensor fusion, real-time perception,
and decision making, which are crucial for enhancing both safety
and autonomous capabilities.
Now let's have a look at the software elements.
They can be broadly classified into fusion,
perception, planning, middleware, and components for
safety. So what are these components? The fusion
software component acts like the brain of ADAS,
merging data from diverse sensors such as cameras,
radars, and lidars to create a comprehensive
and accurate representation of the vehicle's surroundings.
Perception software, on the other hand, involves advanced
algorithms that interpret sensor data,
recognizing objects, pedestrians, and road signs.
This component is crucial for real-time decision making,
providing the system with the ability to identify and
respond to dynamic changes in the environment.
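The fusion idea described above, merging detections from different sensors into one representation, could be sketched as follows. Real fusion stacks use calibrated sensor models and tracking filters (e.g. Kalman filters); this greedy position-matching is only an illustration of the concept:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str     # which sensor produced this detection
    x_m: float      # longitudinal position of the object
    label: str      # what the sensor thinks it sees

def fuse(detections: list[Detection], match_radius_m: float = 2.0) -> list[dict]:
    """Greedily group detections that appear to refer to the same object."""
    fused: list[dict] = []  # each entry: fused position + contributing detections
    for det in detections:
        for group in fused:
            if abs(group["x_m"] - det.x_m) <= match_radius_m:
                group["dets"].append(det)
                # Re-centre the fused position on the mean of its members.
                group["x_m"] = sum(d.x_m for d in group["dets"]) / len(group["dets"])
                break
        else:
            fused.append({"x_m": det.x_m, "dets": [det]})
    return fused
```

Here a camera detection at 40.0 m and a radar detection at 40.5 m would collapse into one fused object, while a pedestrian seen at 12.0 m stays separate.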
Middleware, with platforms like QNX and
AUTOSAR, serves as the communication bridge between the various hardware
and software components in the ADAS ecosystem.
QNX is a real-time
operating system often used in automotive, which ensures
consistent and predictable performance.
And AUTOSAR provides a standardized framework, enabling
seamless collaboration between the different software modules
we have in the system. We also need to ensure
that these ADAS software components adhere
to rigorous safety standards, and this is usually done
through NCAP certification. Functional
safety, which encompasses standards like ISO 26262,
is paramount in the development of ADAS software,
emphasizing the implementation of measures
to prevent or control system
failures that could compromise safety. These
elements collectively form the robust software backbone
of ADAS. Now, in the landscape of autonomous driving,
there are six levels of autonomy, from L0 to L5,
each representing a distinct step towards vehicles
that can navigate the roads without human intervention. At level
zero, we have no automation: the driver is fully in
control. As we ascend through the levels, we witness the gradual
integration of automation, for example
from basic driver assistance features like warning features at level one,
to driving features like adaptive cruise control at level
two, to conditional automation at level three,
where the vehicle can handle certain scenarios independently.
Reaching level five, the pinnacle of autonomy,
we envision fully self-driving vehicles that
can operate in any environment without human input. This
evolution reflects not just technological progress,
but also a paradigm shift in how we perceive and experience
transportation. So before I close,
let's have a look at the impact of ADAS and AD on road
safety and transportation. ADAS emerges as a
vigilant copilot, significantly enhancing road safety
through features like collision avoidance systems, lane departure warnings,
and adaptive cruise control. Simultaneously,
autonomous driving has a revolutionary impact on safety,
with vehicles capable of swift, split-second decision making,
reducing accidents and enhancing traffic flow.
It also enhances accessibility and inclusivity in transportation,
opening new avenues for individuals with
disabilities and accommodating diverse needs.
The integration of these technologies also brings
environmental benefits by optimizing traffic
patterns and reducing fuel consumption
and emissions, thereby contributing to a greener and more
sustainable transportation ecosystem. So,
in conclusion, the combined influence of ADAS and AD
marks a revolutionary shift in the future of transportation,
from advanced safety features to
redefining accessibility and promoting environmental
sustainability. That brings us to the
end of the presentation.
I hope this short talk about ADAS and AD has sparked interest
in some of you and helps you dive deeper
into some of these topics. Thank you.
Have a nice day.