Langchain with Golang
Abstract
LangChain is a framework for working with LLMs in Python or JavaScript, but it now has a Go port, LangChainGo, so you can use all the power of Go to create chat bots and much more.
Summary
- LangChainGo is a port, or fork, of LangChain. You can use vector stores, prompts, document loaders for RAG, agents, LLMs and chains, and it has integrations with OpenAI, Ollama, Mistral, Groq, Google AI and more. With LangChainGo you can also integrate with other OpenAI-compatible APIs.
Transcript
This transcript was autogenerated. To make changes, submit a PR.
Hi everyone, today I will talk about LangChainGo. First, let me introduce myself. My name is Alessio Majelista. I live in Brazil and I am skilled in Node.js, Python, PHP and Go.
What is LangChainGo? LangChainGo is a port, or fork, of LangChain. The documentation is still new and not complete. You can use LangChainGo with Ollama; it's very simple. There is a simple example here with OpenAI, and you can use it with OpenAI or Mistral too; these are among the newer features.
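As a minimal sketch (the model name and prompt are placeholders, not from the talk), a local Ollama call with LangChainGo looks roughly like this:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/ollama"
)

func main() {
	// Assumes a local Ollama server with the "llama3" model already pulled.
	llm, err := ollama.New(ollama.WithModel("llama3"))
	if err != nil {
		log.Fatal(err)
	}

	out, err := llms.GenerateFromSinglePrompt(context.Background(), llm, "Say hello in Portuguese")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```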
It has agents, memories and models. The part about using it locally is not completely documented, but on GitHub you can see a preview of new features, like MariTalk, a model from Brazil (the others are from places I don't know), and it also has memory, more integrations and other features, like Perplexity.
So why use it? You can use vector stores, prompts, document loaders for RAG, agents, LLMs and chains, and it has integrations with OpenAI, Ollama, Mistral, Groq, Google AI, llama.cpp, Llamafile, NVIDIA and Perplexity, and Whisper support is coming. Now let's write some code with LangChainGo.
First you need to import some packages. After this, in my main method, I have a new instance of OpenAI. For example, I can use OpenAI and call the Call function. This function is very simple: I set the context and a prompt, a simple prompt. You can set the temperature (it has a default value), and you can use stop words or not; the default is empty. When I run this I get "The first man to walk on the moon was Neil", because the next word, "Armstrong", is a stop word.
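A minimal sketch of that call, assuming the API key comes from the OPENAI_API_KEY environment variable and an illustrative temperature; the stop word reproduces the output described above:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Reads the API key from the OPENAI_API_KEY environment variable.
	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	ctx := context.Background()
	completion, err := llm.Call(ctx, "The first man to walk on the moon was",
		llms.WithTemperature(0.8),
		// Generation stops before the stop word, so the answer ends at "Neil".
		llms.WithStopWords([]string{"Armstrong"}),
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(completion)
}
```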
With LangChainGo you can integrate with other OpenAI-compatible APIs; for example, we have integration with Groq. It's very, very simple: I set the model, the token and the base URL. The code is the same as above; the difference is the options parameters. Then I have the new OpenAI instance, a context and the Call method. I run this and get "The first man to walk on the moon was Neil" again.
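A sketch of the same code with different options; the Groq base URL, environment variable and model name are assumptions for illustration:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Groq exposes an OpenAI-compatible API, so the openai client is reused
	// with a different base URL, token and model.
	llm, err := openai.New(
		openai.WithBaseURL("https://api.groq.com/openai/v1"),
		openai.WithToken(os.Getenv("GROQ_API_KEY")),
		openai.WithModel("llama3-8b-8192"),
	)
	if err != nil {
		log.Fatal(err)
	}

	completion, err := llm.Call(context.Background(), "The first man to walk on the moon was")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(completion)
}
```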
I can also use it with NVIDIA. When using the NVIDIA API you can't use embeddings, so if you use NVIDIA you can't use it for the embeddings in RAG. I set the base URL, the model and the token, and run this; the output is here. In this example I use something different: I use MessageContent. The system message has the system chat message type and the human chat message type is the user. I use GenerateContent, I set the max tokens and I use a streaming function; this function returns the chunks and shows them here.
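A sketch of that GenerateContent call with a system and a human message, max tokens and a streaming function; the NVIDIA base URL, environment variable and model name are assumptions:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// NVIDIA's endpoint is also OpenAI-compatible; URL, env var and model
	// here are assumptions for illustration.
	llm, err := openai.New(
		openai.WithBaseURL("https://integrate.api.nvidia.com/v1"),
		openai.WithToken(os.Getenv("NVIDIA_API_KEY")),
		openai.WithModel("meta/llama3-8b-instruct"),
	)
	if err != nil {
		log.Fatal(err)
	}

	messages := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeSystem, "You are a helpful assistant."),
		llms.TextParts(llms.ChatMessageTypeHuman, "Who was the first man to walk on the moon?"),
	}

	_, err = llm.GenerateContent(context.Background(), messages,
		llms.WithMaxTokens(256),
		// The streaming function prints each chunk as it arrives.
		llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}),
	)
	if err != nil {
		log.Fatal(err)
	}
}
```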
So now I will show you how you can use LangChainGo for RAG applications. How does it work? First you need to load the document, a PDF or a text file or a database. You split this document, save the chunks in a vector store, retrieve the relevant splits for a query, and then build the prompt and send it to the LLM.
It's very simple. For example, you need some packages, like a vector store. I'm using Qdrant, but you could use another supported vector store instead. You also need agents, chains, document loaders, embeddings and OpenAI (I use OpenAI because I wanted to use NVIDIA, but NVIDIA doesn't have embeddings in this version of LangChainGo), plus schema, the text splitter and the Qdrant vector store.
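A sketch of what that import list could look like (real langchaingo package paths; net/url and the standard-library packages are added here for the later snippets):

```go
import (
	"context"
	"fmt"
	"log"
	"net/url"
	"os"

	"github.com/tmc/langchaingo/agents"
	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/documentloaders"
	"github.com/tmc/langchaingo/embeddings"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/schema"
	"github.com/tmc/langchaingo/textsplitter"
	"github.com/tmc/langchaingo/vectorstores/qdrant"
)
```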
In the first step I need to get my file, in this case a PDF file, and create splits for the document.
For this I create a function, pdfText. This function gets a PDF and splits it into docs using the schema. I am using a PDF of a book as an example; I run this and the first document is here.
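A sketch of such a helper, using the pdfText name from the talk and the imports listed above; the chunk size and overlap are assumptions:

```go
// pdfText loads a PDF file and splits it into schema.Document chunks.
// Chunk size and overlap are illustrative values, not from the talk.
func pdfText(ctx context.Context, path string) ([]schema.Document, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	info, err := f.Stat()
	if err != nil {
		return nil, err
	}

	loader := documentloaders.NewPDF(f, info.Size())
	splitter := textsplitter.NewRecursiveCharacter(
		textsplitter.WithChunkSize(1000),
		textsplitter.WithChunkOverlap(100),
	)
	// LoadAndSplit returns one schema.Document per chunk.
	return loader.LoadAndSplit(ctx, splitter)
}
```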
OK, this next function gets the content and sends it to Qdrant Cloud. I set the URL, the API key, the collection name and the embedder. If I run this one time, it saves the documents one time; if I run it two times, it saves them two times. So I use it together with pdfText and save the docs in my store: when I call this method it sends the docs to Qdrant. After this I can use it for queries. I run this and the output is here.
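A sketch of that step; the collection name and the environment variables for the Qdrant URL and API key are assumptions:

```go
// newStoreAndSave creates the Qdrant store and uploads the document chunks.
func newStoreAndSave(ctx context.Context, docs []schema.Document, embedder embeddings.Embedder) (qdrant.Store, error) {
	qdrantURL, err := url.Parse(os.Getenv("QDRANT_URL"))
	if err != nil {
		return qdrant.Store{}, err
	}

	store, err := qdrant.New(
		qdrant.WithURL(*qdrantURL),
		qdrant.WithAPIKey(os.Getenv("QDRANT_API_KEY")),
		qdrant.WithCollectionName("talks"),
		qdrant.WithEmbedder(embedder),
	)
	if err != nil {
		return qdrant.Store{}, err
	}

	// Every call to AddDocuments embeds and stores the chunks again,
	// so running this twice saves the documents twice.
	if _, err := store.AddDocuments(ctx, docs); err != nil {
		return qdrant.Store{}, err
	}
	return store, nil
}
```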
How can I search without saving again? I create a new method and remove the call to AddDocuments on the store: I just create the query, send it to Qdrant and get the results. My new method is here; it does not call AddDocuments, because I have already added the documents. So in this next function I remove that part, only get the existing instance and return my vector store.
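A sketch of that second helper, with a hypothetical name getStore; it only builds the Qdrant client for the existing collection and returns the store, without adding documents:

```go
// getStore returns a Qdrant vector store pointed at the existing collection,
// without calling AddDocuments, so nothing is inserted again.
func getStore(embedder embeddings.Embedder) (qdrant.Store, error) {
	qdrantURL, err := url.Parse(os.Getenv("QDRANT_URL"))
	if err != nil {
		return qdrant.Store{}, err
	}

	return qdrant.New(
		qdrant.WithURL(*qdrantURL),
		qdrant.WithAPIKey(os.Getenv("QDRANT_API_KEY")),
		qdrant.WithCollectionName("talks"),
		qdrant.WithEmbedder(embedder),
	)
}
```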
Then here is my full code for this version of LangChainGo. You need to set the model; don't use the default model, because in this version the default values are outdated. I create a new OpenAI instance and I create my new embedder with OpenAI, because if I use NVIDIA I can't use embeddings. I set up my Qdrant vector store and use the new function that only returns the store, because I don't want to send the docs to Qdrant again; I need only the vector store instance. My search query is "What is the ISBN of the book?". I send this to Qdrant, get the docs, and use an agent to send my prompt: I initialize the LLM and use the conversational react description agent, I take each doc and add it to my context, and my prompt is very simple, just my context and my question. I set the temperature to 0.8 and use chains.Run. I run this and my response, the ISBN number, is here.
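Putting it together, a sketch of that final flow under the same assumptions (model names, query text and temperature are illustrative, and it reuses the getStore helper from the sketch above):

```go
func main() {
	ctx := context.Background()

	// Set the models explicitly; the defaults in this version are outdated.
	llm, err := openai.New(
		openai.WithModel("gpt-4o-mini"),
		openai.WithEmbeddingModel("text-embedding-3-small"),
	)
	if err != nil {
		log.Fatal(err)
	}
	embedder, err := embeddings.NewEmbedder(llm)
	if err != nil {
		log.Fatal(err)
	}

	// Reuse the existing collection instead of uploading the PDF again.
	store, err := getStore(embedder)
	if err != nil {
		log.Fatal(err)
	}

	query := "What is the ISBN of the book?"
	docs, err := store.SimilaritySearch(ctx, query, 3)
	if err != nil {
		log.Fatal(err)
	}

	// Build a simple prompt: the retrieved chunks as context plus the question.
	contextText := ""
	for _, d := range docs {
		contextText += d.PageContent + "\n"
	}
	prompt := "Answer using only this context:\n" + contextText + "\nQuestion: " + query

	// Agent with no extra tools; the retrieved context travels in the prompt.
	executor, err := agents.Initialize(llm, nil, agents.ConversationalReactDescription)
	if err != nil {
		log.Fatal(err)
	}
	answer, err := chains.Run(ctx, executor, prompt, chains.WithTemperature(0.8))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(answer)
}
```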
Very simple, very easy. Thank you for watching. See you next time.