Transcript
This transcript was autogenerated. To make changes, submit a PR.
Hello, everyone.
This is Rajesh Kamisetty.
I am a lead software engineer at S&P Global.
I've been with S&P Global for almost three years now, and I've been in the data field for almost 14 years. My key passion is using data to create insights.
I have been doing this for many years and for many companies; in the past, I have worked in the retail industry, for companies like ConAgra Foods and Sysco, and in the pharmaceutical industry as well.
Now at S&P, I'm a lead data architect and engineer.
My main goal here is to provide insights for our customers.
S&P, which stands for Standard & Poor's, is a very big organization: we have ETFs, we have ratings (we rate organizations), and we have a group called Market Intelligence, where we provide data sets to consumers.
I am part of that organization.
My role there is to harness the tons and tons of data and provide insights to customers.
That is my main role there, and in this role my focus is always on optimizing the outputs and insights provided to customers.
Today's topic is one everybody has heard about: autonomous vehicles.
Everybody talks about Tesla, right?
A lot of people know Tesla, the car company, and it is in pole position in the autonomous vehicle race with its own feature, FSD, fully self-driving.
Basically, I want to give you an idea of how these autonomous vehicles work and, in addition, what role AI plays in them.
So I'll explain the basics of autonomous vehicles, what technology they use, and what impact AI, artificial intelligence, and specifically neural networks, have on autonomous vehicles.
So stay buckled.
As I mentioned, we'll be talking about autonomous vehicles.
The main focus of my lecture today will be the safety side of it.
These are the few topics I will be discussing today: what technologies, like perception systems, are used in autonomous vehicles; how the integration between sensors and data is handled; examples of technologies used by companies like Tesla and Waymo; how AI is used there to drive decision making; how fail-safe mechanisms work; and what the future path for these autonomous vehicles is, especially how AI is going to shape that future.
We will also discuss a little bit about safety standards and delivery, and then we'll come to the conclusion.
The reason I picked this topic is that autonomous vehicles, these driverless, automated technologies, have created so much excitement among the general public.
If you look at it today, AI is playing a key role in enhancing safety and self-driving capabilities.
And look at the amount of data these AI models are processing: as of today, an autonomous vehicle processes almost 1.5 terabytes of sensor data per hour.
Basically, autonomous vehicles take this sensor data, along with other information, and with built-in AI models they make decisions, which is an exciting thing.
I just wanted to dig deeper into this technology, and the key factor that made me pick this topic is the effectiveness numbers: for example, the numbers say that collisions are reduced by 37 percent when compared to human drivers.
I'm not saying autonomous vehicles are the best, but it's a statistical number showing a large decrease in collisions compared to human drivers.
That is the reason I picked this topic, and we'll dig deeper and learn more about the technologies and how these autonomous vehicles work.
As I mentioned, we'll be talking about a few components of these autonomous vehicles.
In the current world, all these components are integrated with AI, and they are providing better results.
The first main component in these autonomous vehicles is sensor fusion.
So what is sensor fusion?
These are the sensors assembled across the autonomous vehicle, which provide a 360-degree view of the vehicle's environment, create awareness, and map the vehicle accordingly.
There are a large number of sensors covering 360 degrees, and they are all integrated together.
The data they capture is provided to the software in the vehicle, and decisions are made based on what the sensors have captured.
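The fusion idea above can be sketched numerically. One common textbook approach is inverse-variance weighting, where the more precise sensor dominates the combined estimate; the readings and variances below are made-up numbers, not real vehicle specs.

```python
# Illustrative sensor fusion: combine distance estimates from two sensors
# using inverse-variance weighting (hypothetical numbers, not real specs).

def fuse(readings):
    """Fuse (value, variance) pairs into one estimate and its variance."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    return value, 1.0 / total

# Radar says the car ahead is 42.0 m away (variance 0.25 m^2);
# lidar says 41.6 m (variance 0.01 m^2). The fused estimate leans
# toward the more precise lidar reading.
fused, var = fuse([(42.0, 0.25), (41.6, 0.01)])
print(round(fused, 3), round(var, 4))  # -> 41.615 0.0096
```

Note how the fused variance (0.0096) is smaller than either sensor's alone, which is the point of combining them.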
The other technology is LiDAR, which stands for Light Detection and Ranging.
It is one of the key components, or technologies, used in autonomous vehicles.
It emits laser light and measures how far away objects are, their depth, and that kind of information.
It covers 360 degrees, does 128-layer scanning, and its range extends to about 250 meters.
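The ranging described above rests on a simple time-of-flight relation: the pulse travels out and back, so distance = c × t / 2. A minimal sketch:

```python
# Lidar ranges by time-of-flight: the laser pulse travels out and back,
# so distance = speed_of_light * round_trip_time / 2.

C = 299_792_458  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    """Distance to a target given the pulse's round-trip time."""
    return C * round_trip_s / 2

# A target at the 250 m range limit returns the pulse in ~1.67 microseconds.
rt = 2 * 250 / C
print(round(tof_distance(rt)))  # -> 250
```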
The next piece is cameras.
If you talk about autonomous vehicles, the big company that comes to mind is Tesla, and Tesla mainly relies on cameras.
They use high-end cameras, like sports cameras, which capture more than 120 frames per second.
These high-resolution pictures are captured, the neural networks in the software process the images, and then decisions are made.
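The 120-frames-per-second figure implies a tight per-frame processing budget, which simple arithmetic makes concrete (real systems pipeline the work, which is ignored here):

```python
# At 120 frames per second, each camera frame leaves roughly 8.3 ms
# of processing budget before the next frame arrives.
fps = 120
budget_ms = 1000 / fps  # milliseconds per frame
print(round(budget_ms, 2))  # -> 8.33
```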
Then the other piece is radar.
Radars also play a key role: they detect vehicles out to about 300 meters, and they are very critical for autonomous vehicles.
The specs around these key components are listed here just for your reference.
FOV stands for field of view; horizontal scanning is 360 degrees for the LiDAR; the cameras use infrared and similar techniques for low light; and the radar reaches about 300 meters.
The range accuracy is plus or minus 0.1 meters, and the angular accuracy is 0.1 degrees, which is really great for the components we are talking about.
Next, having spoken about the technology, I'll talk about the companies.
When we talk about AVs, the biggest company that crosses our mind is Tesla, and specifically with Tesla, it comes down to FSD, full self-driving.
There are other companies like Waymo, Uber, and Mobileye, but I just picked two examples here to give an idea of how Tesla is different from the others.
Tesla takes a completely vision-based approach, as I mentioned a minute ago.
It is all about capturing pictures around the vehicle, processing those pictures, and providing the output, the decision making, to the autonomous vehicle.
So the main goal for Tesla is: how good is the picture quality, how many pictures is it taking, and how fast is it processing them and providing a decision.
Tesla has a huge fleet of vehicles, so it has a much bigger data set compared to other companies; it has, I would say, the upper hand on sample data for processing and decision making.
Cost is also part of Tesla's reasoning: LiDAR is a little expensive compared to cameras, so some people feel that to save on cost, Tesla is skipping LiDAR technology and using more cameras.
That's one way of thinking about it.
Coming to Waymo: Waymo uses LiDAR, radars, and cameras.
I think a lot of people have seen the Waymo cars on the streets: on top there is a spinning LiDAR unit, there are a couple of cameras by the two doors, and a lot of other hardware around the car.
Waymo uses LiDAR, radars, and cameras together to create a comprehensive perception system.
Based on this, it creates a high-definition map and precise vehicle positioning.
Combined with machine learning and AI, this gives a better driving experience.
Many people know that Waymo is not fully rolled out in the market, but I would say it's also trying to get on par with FSD.
If you look at the stats, the amount of data being processed by these vehicles is amazing.
For example, the neural networks built to serve the autonomous vehicle process almost 250,000 scenarios per iteration.
Vehicle detection, as of today, is at 98 percent accuracy, and pedestrian detection is at 95 percent.
Detection is a little lower for pedestrians than for vehicles because pedestrians are moving people: they may suddenly cross the street or suddenly pop up in between.
That, I guess, is the reason accuracy is a little lower for pedestrians.
If you look at the response time, it is about 300 milliseconds, which is really great, roughly five times faster than a regular human.
If you're a regular human driving, the average response time is about 1,500 milliseconds.
But for autonomous vehicles, whether Tesla's, Waymo's, Uber's, or Mobileye's, the response time is approximately 300 milliseconds.
That's one key factor that makes me feel this is an exciting technology.
As I said in the first few minutes, there is a reduction in collisions compared to human drivers, and this is the main reason: it's all about reaction time when you're driving a car or any vehicle.
How fast you react is what saves you from a collision.
And the data keeps being fed in and processed: there are 50,000 hours of data, and all of it is processed for enhanced safety features.
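Those reaction-time figures translate directly into distance traveled before any braking even begins. A quick sketch using the talk's round numbers (1,500 ms for a human vs 300 ms for the vehicle) at an assumed speed of 100 km/h:

```python
# Distance covered during the reaction delay alone, at 100 km/h.
# The 1500 ms / 300 ms figures are the talk's round numbers; the speed
# is an assumption chosen purely for illustration.

speed_mps = 100 / 3.6  # 100 km/h converted to m/s

def reaction_distance(reaction_ms):
    """Meters traveled before the driver/system even starts to react."""
    return speed_mps * reaction_ms / 1000

human = reaction_distance(1500)
av = reaction_distance(300)
print(round(human, 1), round(av, 1))  # -> 41.7 8.3
```

At that speed the human covers over 40 meters before reacting; the autonomous system, about 8, which is where the collision-reduction argument comes from.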
Here are some of the stats to show this technology is viable and has a clear path forward.
As I mentioned before: a 37 percent reduction in collisions compared to human drivers; pedestrian safety at 90 percent effectiveness; detection accuracy at 98 percent for vehicles and 95 percent for pedestrians; and reliability that is top notch, 99.99 percent.
And if you look at the mean time between failures, MTBF, a stat for how often the system fails, it's around 50,000 hours.
So, as I said, it's a very disruptive technology, and I see a very bright future for autonomous vehicles.
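The MTBF figure relates to that 99.99 percent reliability through the standard availability formula, availability = MTBF / (MTBF + MTTR). The 50,000-hour MTBF is from the talk; the 5-hour mean time to repair is a made-up assumption purely to show the arithmetic:

```python
# Availability from mean time between failures (MTBF) and
# mean time to repair (MTTR): availability = MTBF / (MTBF + MTTR).

mtbf_h = 50_000  # hours between failures (figure from the talk)
mttr_h = 5       # hours to recover -- a hypothetical assumption

availability = mtbf_h / (mtbf_h + mttr_h)
print(round(availability * 100, 2))  # -> 99.99
```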
So as I mentioned, what is the future?
How do we improve safety, and how do we get as close to 100 percent safety as possible?
Nothing will be 100 percent for sure, but how close can we get?
Here is how AI, artificial intelligence, is helping: if you look at neural processing units, as of today we are close to 1,000 TOPS, tera operations per second, which is really good.
This number of operations makes the outputs produced by the AI models as accurate as possible.
Another thing is that the latency of communication between sensors and cameras is being decreased; we are looking at five milliseconds of latency.
Compare that to the reaction time of humans, which is almost 1,500 milliseconds; and as we mentioned earlier, the reaction time of autonomous vehicles is about 300 milliseconds.
The more the latency, the more trouble we are in.
So the main goal for autonomous vehicles, when they are collecting data and passing information to the rest of the ecosystem in the vehicle, is to reduce that latency; today they are aiming to get to 5 milliseconds or less.
The other thing is that neural networks, AI, machine learning, and quantum computing, which is the next generation of computing, will improve all this processing and make it better.
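To put those latency targets in perspective, here is how far a vehicle travels during each delay at an assumed highway speed of 120 km/h, using the talk's figures (5 ms target, 300 ms autonomous reaction, 1,500 ms human reaction):

```python
# Meters traveled during different latency/reaction windows at 120 km/h.
# The speed is an assumption for illustration; the delays are the
# talk's figures.

speed_mps = 120 / 3.6  # 120 km/h in m/s

for latency_ms in (5, 300, 1500):
    meters = speed_mps * latency_ms / 1000
    print(latency_ms, round(meters, 2))  # 5 -> 0.17, 300 -> 10.0, 1500 -> 50.0
```

A 5-millisecond internal latency costs under 20 centimeters of travel, which is why it barely adds to the overall 300-millisecond reaction budget.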
And then, with the growing amount of data collected every day as these vehicles roam around, your data sample grows enormously.
There is a continuous feedback mechanism, so the algorithms are trained, designed, and deployed very well, and the decisions made by the autonomous vehicles are close to accurate.
There are some scenarios people think about, like how the vehicle reacts if it is headed for a collision with a pedestrian or another car.
These are the ethical concerns people have.
Eventually, once the technology is fully approved and widely used, we will see how those ethical concerns are handled.
But I am not going to talk about the ethical concerns here; we are just talking about the technologies.
As we touched on ethics a little, there are also safety standards, compliance, and regulation.
There are a few regulations in place to make sure these systems are compliant.
For example, ISO 26262 covers compliance with functional safety standards, defining how compliant they should be, and there is also something called SOTIF, safety of the intended functionality, guaranteeing safety under operational conditions.
These are a few standards that try to impose some kind of compliance and regulation, and there are also protocols around how many miles of real-world driving the autonomous vehicle has done, how many simulations it has been through, and how many edge-case scenarios it has covered, including adverse conditions: snow, fog, rain, hail, or whatnot.
The other metrics they key in on are how far the radar detects, and latency, as I mentioned: latency is at an average of 300 milliseconds, but they're trying to get to 100 milliseconds, and then reliability.
What happens if the system gives up on me?
That is very important, right?
You cannot get stuck in the middle of the road, so the reliability, the uptime, should be almost 99.99 percent, which is close to 100 percent.
I feel that hitting these metrics, like 100-millisecond latency, would be a big achievement, and reliability close to 100 percent is also an achievement.
Currently, with governments following along, I see a good path forward for autonomous vehicles, so we can expect much more development in the next few years.
Coming to the conclusion, I just want to mention the fact that AI is helping autonomous vehicles like crazy.
AI is driving autonomous vehicles to achieve what they are intended to do.
Look at the significant strides: improving safety, reducing collisions, identifying pedestrians.
AI plays a key role in all of this, especially neural networks processing close to a thousand TOPS, vehicle-to-vehicle communication bringing latency down to less than five milliseconds, and reliability of 99.99 percent; these are the key concepts for the success of autonomous vehicles.
Hopefully we will get there, and probably within a few years we won't need to drive anymore and can just sit and relax.
Thank you very much.