Transcript
This transcript was autogenerated. To make changes, submit a PR.
Hi everyone, and welcome to my session. It is called "Large Language Models for Better Developer Learning of Your Product."
My name is Babur. I'm a developer advocate at a company called Atacama, and I previously worked at Microsoft.
I have been working in DevRel for over five years, and I'm also an active open source contributor to the Apache Software Foundation.
If you would like to have a chat, connect with me on LinkedIn or Discord; I would be more than happy to have an online session to answer your questions about DevRel and about how we are leveraging AI in DevRel.
I have also been speaking at many events: last year and this year, in 2024, I delivered various in-person and online sessions at different conferences. You can see pictures from some of those previous sessions.
So, here is today's agenda.
As we know, developers are highly interested in AI technologies; I assume that more than 60% of developers globally are already working on or learning about AI and generative AI.
In this session we'll discuss how developer relations meets generative AI this year, how these AI changes are affecting DevRel, and some of the considerations you need to take into account when integrating AI tools into your DevRel strategy.
I will also walk you through these tools, and we will try to address the main question: is the current AI wave a threat or an opportunity for DevRel teams?
Nowadays, a growing amount of content such as legal papers, academic studies, news, technical guides and books can be processed automatically using AI.
Large language models can be applied, as we can see, to various use cases and industries. For example, OpenAI's GPT-4 is a powerful LLM used for a wide range of NLP tasks, and ChatGPT, which we know set the record for the fastest-growing user base in January 2023, proves that large language models are here to stay with us for a long time.
As I am also a developer and have been working with Python for a long time, I decided to build my own applications using these new AI technologies.
What about quickly summarizing your content? Get the information you need in real time from large private unstructured documents in your Dropbox; the same tool can be used with OneDrive or Google Drive. I decided to make my job easier when creating invoices or summarizing my content by using my own private tool that connects to Dropbox and analyzes my documents.
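To give a rough idea of how such a tool could work, here is a minimal sketch, not the actual app from the demo: it reads text files from a local folder standing in for a synced Dropbox folder and asks the OpenAI chat API for a short summary. The folder name, model and prompt are my assumptions for illustration.

```python
# Minimal sketch: summarize documents from a local folder (stand-in for a
# synced Dropbox folder) using the OpenAI chat completions API.
# Assumes OPENAI_API_KEY is set and ./dropbox_docs contains .txt files.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize(text: str) -> str:
    """Ask the model for a short, plain-language summary of one document."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model works here
        messages=[
            {"role": "system", "content": "Summarize the document in 3 bullet points."},
            {"role": "user", "content": text[:12000]},  # naive truncation to stay within context
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    for doc in Path("dropbox_docs").glob("*.txt"):
        print(f"--- {doc.name} ---")
        print(summarize(doc.read_text(encoding="utf-8")))
```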
Next, this is another app, one that finds real-time discounts and sale prices from various online marketplaces around the world. I connected it to real-time Amazon APIs to fetch discounts, deals and coupon information, which makes it easier for me to find the discounts I am interested in. You can also extend this feature by adding alerts: when there is a discount, you get the discount information from the AI application.
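The alerting part can be very simple. Here is a rough sketch of that idea only; the deals endpoint below is hypothetical, since the actual Amazon APIs used in the demo are not named here.

```python
# Sketch of the alerting idea: poll a deals feed, keep only large discounts,
# and print an alert line for each one. The endpoint is hypothetical.
import requests

DEALS_URL = "https://example.com/api/deals"  # hypothetical deals endpoint
MIN_DISCOUNT_PERCENT = 30

def fetch_deals() -> list[dict]:
    """Fetch the current list of deals as JSON."""
    response = requests.get(DEALS_URL, timeout=10)
    response.raise_for_status()
    return response.json()

def alert_on_discounts(deals: list[dict]) -> None:
    """Print an alert for every deal at or above the discount threshold."""
    for deal in deals:
        if deal.get("discount_percent", 0) >= MIN_DISCOUNT_PERCENT:
            print(f"ALERT: {deal['title']} is {deal['discount_percent']}% off "
                  f"at {deal['price']}")

if __name__ == "__main__":
    alert_on_discounts(fetch_deals())
```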
With another application, you can think about an LLM app that provides real-time alerts for significant document changes or updates. Let's say you are working with marketing campaigns: this system can monitor various aspects such as content changes, campaign performance metrics or audience engagement. Real-time alerts enable marketing teams to respond quickly to changes and make sure these campaigns remain on track.
The last application I would like to demo effortlessly extracts and organizes unstructured data from PDFs, docs or other unstructured content into SQL tables in real time. This example, as you can see, extracts data from unstructured files and stores it in a PostgreSQL table. It also transforms a user query into a SQL query, which is then executed on the PostgreSQL tables.
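This is not the exact implementation from the demo, but a minimal sketch of the idea: ask the model to pull a few fields out of a document as JSON, then store the result with psycopg2. The table name, column list and connection string are assumptions for this example.

```python
# Minimal sketch of the unstructured-to-SQL idea: extract fields from raw text
# with an LLM, then insert the result into PostgreSQL.
# The invoices table, its columns and the connection settings are assumptions.
import json

import psycopg2
from openai import OpenAI

client = OpenAI()

def extract_invoice_fields(raw_text: str) -> dict:
    """Use the model to pull vendor, total_amount and invoice_date out of text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Extract vendor, total_amount and invoice_date from the text. "
                        "Reply with a single JSON object using exactly those keys."},
            {"role": "user", "content": raw_text},
        ],
    )
    # In practice you would validate this JSON before trusting it.
    return json.loads(response.choices[0].message.content)

def store_invoice(fields: dict) -> None:
    """Insert one extracted record into an existing invoices table."""
    with psycopg2.connect("dbname=demo user=demo") as conn:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO invoices (vendor, total_amount, invoice_date) "
                "VALUES (%s, %s, %s)",
                (fields["vendor"], fields["total_amount"], fields["invoice_date"]),
            )

if __name__ == "__main__":
    with open("invoice.txt", encoding="utf-8") as f:
        store_invoice(extract_invoice_fields(f.read()))
```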
So, as you can see, we are progressing quite fast: I was able to build each of these four applications in about one or two hours, and they are already ready to use. Let me know in the comments if you find these applications useful; I will be more than happy to provide the source code for them so you can give them a try yourself.
Now let's switch our attention back to developer relations, or DevRel for short. What exactly does developer relations mean? It is a marketing policy that prioritizes relationships with developers. For those who don't know in depth what DevRel is and what developer advocates do: DevRel can act as a bridge between your company and its technical audience. Our primary goal is to build a strong, engaged community around the product or any developer technology. We provide education and support, we try to foster engagement, and we try to simplify developers' learning experience and the challenges they face. Some people ask when exactly DevRel
can be helpful for a company. If you're building a developer-oriented product, that's where DevRel can help. Assume the marketing team cannot reach the right audience because that audience demands highly technical content and actively avoids the usual sales and marketing channels; because a DevRel advocate is a technical person, they have more expertise in providing examples and crafting messages for that audience. Or maybe product managers are struggling to understand new industry trends without being experts in the domain. Or sometimes engineering teams are super busy building the product and might not have the time or skill set to do everything that DevRel does.
From that perspective, DevRel divides into four pillars. Mainly, we do developer marketing: we try to understand which developers we are targeting for a product, and we make sure they have the information and tools to make a decision. Developer enablement, developer advocacy and community are also part of our responsibility. As you can see, once developer relations is introduced, it also enables marketing, community and other functions. For example, we are always creating and maintaining processes where our developers can work toward a common goal. Developer relations enables developer education and fosters developer experience, support and developer success. For developer education, we create documentation, tutorials, videos and guides. For developer experience, we constantly improve API design and the SDK experience, and we gather feedback while developers are using our SDKs. And we also support developer success. Now that we know what DevRel is, let's turn our attention to how AI is changing developer relations nowadays.
I think AI will be an accelerator of pre-existing developer relations trends. For example, AI-assisted documentation for production use cases: AI can act as a copilot for us, taking over routine and boilerplate tasks. In this context, AI can help speed up the documentation process by inspecting our APIs and code. It is also helping us nowadays with creating code samples and with supporting some developer requests in real time. You don't have to answer every question yourself; an AI chatbot can answer them. The same applies in the support context: a well-implemented bot can always handle simple support requests. Or, if you have an AI chatbot, developers, instead of going through enormous docs files, can search for specific information directly in the search bar.
I have started using ChatGPT, for example, to generate short descriptions of new articles, new variations of titles for my blog posts, and even some article outlines. While I still have to guide and fact-check the AI, it saves at least a few hours of my work, and I am the one who fixes and refines the AI's output. Imagine being able to summarize community posts, Discord channels or Slack channels. I can almost guarantee we will soon begin seeing community copilots that can help coordinate between different channels and outlets.
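As a rough illustration of that community-copilot idea, here is a minimal sketch that condenses a batch of channel messages into a short digest. The messages are hard-coded as an assumption; in a real bot they would come from the Discord or Slack API.

```python
# Sketch of a "community copilot": condense a batch of channel messages into a
# short digest for the DevRel team. Messages are hard-coded for illustration.
from openai import OpenAI

client = OpenAI()

messages = [
    "alice: the quickstart fails on Python 3.12, is that known?",
    "bob: +1, same error with pip install",
    "carol: docs say 3.11 is required, maybe we should pin that in the README",
]

def summarize_channel(msgs: list[str]) -> str:
    """Return a short digest of what the community discussed and any action items."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Summarize these community messages in 2-3 sentences and "
                        "list any action items for the DevRel team."},
            {"role": "user", "content": "\n".join(msgs)},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize_channel(messages))
```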
However, as you can see, generative AI is creative, but not as creative as we humans are. It also doesn't do well with personal experience or realistic examples, especially in niche topics like ours, data processing pipelines in Python. While you can try to push it to do so, it doesn't always understand real-life struggles when replicating human experiences, and it doesn't do well with an extended piece of context. It always struggles to go deeper into a topic, so you will often find it repeating the same level of definition of your topic while you are asking questions.
Here's a table of the ways AI is helping me nowadays. For example, last week I pushed a sample repo to GitHub showing how to build real-time data processing pipelines in Python. This also involved some of the number-one use cases just above, since I was using ChatGPT while writing the code; for simple scenarios, GPT nowadays can create something reasonably close to reality.
The most powerful thing, I think, is that you can feed some context, documentation and other information to your AI and provide search functionality, let's say on your docs page. This is how we in DevRel can approach helping developers find information easily.
We have already seen some work being done with support bots or ask-the-docs functionality for developer documentation or API descriptions. Let's have a look at building an AI chatbot that can help people understand your developer documentation more easily and answer user questions. Developers can find answers quickly without needing human intervention, which speeds up workflows and improves overall developer satisfaction. With the Pathway team, what we did was integrate this ask-me bot into the Discord server where our Pathway community is. You can ask questions about your specific information to easily find documentation details or code samples. This allows developers to get an answer to a specific question from a prompt. It saves time compared to parsing pages of documentation or contacting a developer relations representative directly. Code-wise, it's simple: if you navigate to our open source repository, Pathway, you can find many examples.
As I demonstrated at the beginning, one of the examples also looks quite simple: how to connect the AI chatbot to our documentation. It's very easy. As you can see, we are just connecting to the docs data by using built-in connectors, and we have libraries that make it easy to calculate vector embeddings by splitting the large amount of data into chunks and feeding this data to the Discord server in real time. This is also known by another name: the RAG approach, retrieval-augmented generation.
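Before we look at the architecture diagram, here is a minimal sketch of just the Discord side of such a bot, written with discord.py. The retrieval step is hidden behind a placeholder answer_from_docs() helper, which is my own naming; a minimal version of that retrieval step is sketched after the architecture discussion below.

```python
# Sketch of the Discord side of an "ask the docs" bot using discord.py.
# answer_from_docs() is a placeholder for the retrieval step; a minimal
# version of that step is shown in the RAG sketch further below.
import os

import discord

def answer_from_docs(question: str) -> str:
    """Placeholder: look up relevant doc chunks and ask the LLM for an answer."""
    return f"(stub) I would answer '{question}' from the documentation here."

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text in discord.py 2.x
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message) -> None:
    # Ignore the bot's own messages and anything that isn't an !ask command.
    if message.author == client.user or not message.content.startswith("!ask "):
        return
    question = message.content[len("!ask "):]
    await message.channel.send(answer_from_docs(question))

client.run(os.environ["DISCORD_BOT_TOKEN"])
```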
As you can see, the diagram highlights the common architecture for it. While ingesting data from APIs, files or databases, what you do is build a data pipeline that processes the data, transforms it and calculates embeddings, the vector embeddings we know. After calculating the vector embeddings, it stores them for fast retrieval in vector storage, such as a vector database. Then you can start to build your application on top of that: something that provides a search bar, backed by a service that accepts user queries. It again calculates vector embeddings from the query, and from those embeddings it finds the closest vectors stored in the previous step in the vector database. That's how the common RAG approach works.
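To make that flow concrete, here is a minimal in-memory sketch of the same steps, assuming OpenAI embeddings and chat models; the chunk size, model names and prompt are assumptions, and a production setup would stream updates and use a proper vector store instead of a Python list.

```python
# Minimal in-memory RAG sketch: chunk documents, embed them, retrieve the
# closest chunks for a question by cosine similarity, and answer with the LLM.
import numpy as np
from openai import OpenAI

client = OpenAI()
EMBED_MODEL = "text-embedding-3-small"

def embed(texts: list[str]) -> np.ndarray:
    """Return one embedding vector per input text."""
    response = client.embeddings.create(model=EMBED_MODEL, input=texts)
    return np.array([item.embedding for item in response.data])

def chunk(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size character chunks (naive chunking)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

# Ingestion: chunk the docs and store their embeddings in memory.
docs = ["...your documentation text goes here..."]
chunks = [c for doc in docs for c in chunk(doc)]
index = embed(chunks)

def answer(question: str, top_k: int = 3) -> str:
    """Embed the question, find the closest chunks, and ask the model."""
    q = embed([question])[0]
    scores = index @ q / (np.linalg.norm(index, axis=1) * np.linalg.norm(q))
    context = "\n\n".join(chunks[i] for i in np.argsort(scores)[-top_k:])
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How do I install the library?"))
```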
But if you are trying to build such an application yourself, or to feed custom context to an AI application, it's not that easy: nowadays you need to know a lot of technologies and frameworks, and sometimes it's, well, confusing. Here you can see the many technologies being built around AI applications, such as frameworks, APIs and so on.
While I was working with the Pathway team in the past, we tried to build our own framework to help developers reduce this work; I mean, you don't have to know all of these technologies and tools to build applications. Our LLM App, provided by Pathway, is fully open source, and it lets you replace all of these technologies and all of that knowledge with a single application. With some of the simple applications we introduced, we improved our developer experience by using the Pathway open source technology.
For example, it means you can reduce go-to-market time. It's lower cost, because open source is free, and it gives you a highly secure context where you can also run it on top of your custom LLMs, without using a public LLM provider. And of course, under the hood it uses Pathway technology, so you can connect to any real-time data sources. Nowadays, as open source, we support the following sources: structured, semi-structured and live data. You can ingest data from your docs page, from PowerPoint files, from any PDFs or from Slack channels.
To analyze your developer experience better, here's a list of the key features the LLM application offers. For example, it can index documents in real time without using any vector storage or vector database, which also reduces infrastructure overhead. From an architectural perspective, it greatly simplifies the emerging technologies used with LLMs. As you can see in the simple architecture we are demonstrating, everything is managed by a single framework, so you don't have to know how these things work internally, and with a few lines of code you are already building a developer experience for this Discord server.
For example, you can connect to your internal local files or to external sources, maybe GitHub markdown files, to ingest the data, and we can connect to multiple publicly available LLM providers like OpenAI, or models from Meta or Google. Everything else is fully managed: vector indexing, chunking the information and feeding this data into the Discord server.
There are still some challenges in bringing these LLM applications or AI chatbots to the production level. It's easy to build some examples, but we are still testing our Discord server applications. Besides natural-language issues, the problems we have seen include hallucinations and inconsistent latency. For example, you never know when you will get an answer from OpenAI, because they don't have SLAs; there is no guaranteed average response time, so you can't expect the response from OpenAI to arrive on time.
Another problem is offline evaluation. Of course, when you're writing, for example, unit tests, or testing documentation or code samples for correctness, it's impossible to evaluate them without connecting to the public OpenAI API servers. And sometimes the LLM providers' models respond differently to each request, which makes it very hard to test and make sure that everything is working fine.
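One common workaround, sketched below under my own assumptions rather than taken from the talk, is to pass the LLM client into your functions as a dependency, so unit tests can substitute a mock and never touch the OpenAI API at all.

```python
# Sketch of offline testing: inject the LLM client as a dependency, then unit
# tests can substitute a mock and avoid calling the OpenAI API entirely.
# The helper and the test are illustrative; run the test with pytest.
from unittest.mock import MagicMock

def generate_title(client, draft: str) -> str:
    """Ask the (injected) client for a title; in production pass OpenAI()."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Write a title for: {draft}"}],
    )
    return response.choices[0].message.content.strip()

def test_generate_title_offline():
    # The mock returns a canned completion, so the test is fast and deterministic.
    fake_client = MagicMock()
    fake_client.chat.completions.create.return_value.choices[0].message.content = (
        "Real-Time Data Pipelines in Python"
    )
    assert generate_title(fake_client, "a draft post") == "Real-Time Data Pipelines in Python"
```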
As you have seen, it's easy to make something cool with LLMs, but it's very hard to make something production-ready with them. So if you are interested in exploring our open source framework in Python for building such applications, scan the QR code; it will bring you to the source code I have already shown you, and you can try to run the simple Discord application. Maybe it will help you run your own Discord AI chatbot that responds based on the documentation you have.
Here are the takeaways from my session. As you can see, integrating AI into software development and DevRel is best seen as accelerating our existing productivity rather than completely disrupting us. AI represents both significant opportunities and challenges for DevRel: it can automate tasks like technical documentation and support, which leads to increased efficiency and potentially enriches the developer experience. I also think AI will enable a new class of users, like creators, who can tap into productivity that was previously available only to software engineers; non-software engineers can also do some engineering work by using low-code and no-code approaches. So start thinking about AI integrations for your developer product. Thank you for attending
my session. If you have any questions, feel free to ask me on LinkedIn or leave your question in the chat. I will be more than happy to address them. Thanks. Take care.
Bye.