Transcript
This transcript was autogenerated. To make changes, submit a PR.
Hello and welcome to Conf42 Platform Engineering 2024.
I'm Harald, and I'm going to talk about developing and creating Terraform providers.
We'll start with an IaC refresher: what is infrastructure as code?
Then: where do I start when I want to create a provider, and how do I go about
creating a provider, including a live demo?
First of all, I'm Harald Safra, Data Platform Engineering Team Lead at Riskified.
My team manages all the online databases: everything that has to do with online
systems and access, either for end customers or for analytical systems.
Infrastructure as code is a programmatic definition of infrastructure elements.
It allows repeatable, documented processes; you don't have to
click through UIs or do the same manual task again and again.
And there are two general approaches to how you go about infrastructure as code.
The first of them is declarative.
The user defines what they want to achieve, and then the platform
framework does the work for them and provides the infrastructure.
The other one is imperative: the user defines how they want to create it.
They iterate over loops and write code, or code-like structures, that goes ahead
and creates the infrastructure elements. You can contrast Terraform (declarative)
and Pulumi (imperative) on either side of that divide.
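As a concrete illustration of the declarative style, here is a minimal, hypothetical Terraform configuration: the user only states the desired end state, and the framework works out the steps (the AMI ID here is a placeholder, not a real image):

```hcl
# Declarative: describe what should exist; Terraform figures out how.
resource "aws_instance" "web" {
  ami           = "ami-0abcdef1234567890" # placeholder AMI ID
  instance_type = "t3.micro"
}
```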
Providers are the plugins that interface with the infrastructure API.
Terraform Core is a framework that knows how to create an object, an
infrastructure element, but doesn't have any idea about how the
infrastructure is actually created and managed.
To bridge that gap, Terraform providers interface between Core
and the infrastructure API, and Core interfaces with a provider over RPC.
As of today, there are 4,400-plus providers, and the list goes on and
on, and the main thing to remember is that anyone can create more providers.
You don't have to be an expert in Go.
You don't have to be part of the infrastructure team.
You don't have to be part of the product or the company at all.
When I needed to create a provider for an external database
used by my team, I was not part of the database's development.
So anyone can create one,
as long as there's a public API that allows you to interface with it.
So where do I start when I want to create a provider?
The first step is to understand a bit about the architecture of the process.
We have Terraform Core on the left side.
Terraform Core is the executable that you run, and it manages the operations.
It knows how to create infrastructure, it knows how to read infrastructure, and
it knows how to calculate deltas between the needed state and the current state.
It interfaces over RPC with a Terraform provider.
That's the plugin that you will develop.
The Terraform provider, which is written in Go, specifically
interfaces with a client library,
and that client library interfaces over a native protocol with the infrastructure.
The native protocol could be HTTP calls, gRPC, SQL, system calls,
a REST API: anything the infrastructure knows how to understand and that
implements the API.
You need to find the correct API to
interface with the infrastructure.
There could be old versions, and new versions that are easier to work with.
But if there's a version that has an existing Golang client for
it, I would advise going for it, because it will make it easier for you
to create the plugin.
Obviously, as I mentioned,
the provider is written in Go, so you need to know and understand Go.
Go is a compiled, high-level programming language.
You don't need to have a specific deep understanding of it;
you can use Google or ChatGPT to help you along.
But you need to understand the constructs, the control structures,
and everything like that.
There's a good step-by-step tutorial that I used,
at go.dev/tour. It will help you learn Go step by step,
from basic primitive types through control structures,
interfaces, and things like that.
I found it a simple language to understand and learn, especially if you have
knowledge of other programming languages.
It's compiled, which I like, because the compiler finds errors that you
otherwise don't bump into until runtime.
One gotcha that I found is that there are no exceptions, meaning you
need to explicitly check for error conditions on function calls.
Otherwise, your code will just panic.
Remember to check return values; the err
return value is your friend.
HashiCorp's documentation
is the next thing I used, and I recommend it. It's located in
their developer portal, at the link here, and you can also scan the QR code
to find it. Read the docs; they will give you a basic understanding
of how things plug into one another.
After you've learned Go and read the
HashiCorp documentation, you can go ahead and create a provider.
I will be using a demo to make this a bit more real.
This demo is a plugin that creates and manages lines in text files.
Simple lines in simple text files.
All files are managed under a single path.
There's a file resource that has a file name and a lines array.
The file API was provided for you by me and is limited.
It doesn't have a lot of primitives.
You can read a line, you can write a line, and you can count the number of lines
in a file, but not a lot more than that.
And you can see that the resource definition on the left, which has one file
with two lines, "line 1" and "line 2", maps into a file that has two text lines.
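To make that mapping concrete, a configuration along these lines would describe the resource on the left (the resource type and attribute names are my guesses at the demo's schema, not the exact demo code):

```hcl
# Hypothetical configuration for the demo's file resource.
provider "file" {
  base_path = "/tmp/demo"
}

resource "file_file" "one" {
  filename = "file_1"
  lines    = ["line 1", "line 2"]
}
```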
You can grab the code from my GitHub
repo, terraform-provider-file; you can also scan the QR code to get it.
The Plugin Framework is the recommended way to create providers.
There have been two or three different ways over the years, following API
specification changes, and the new way is the Plugin Framework.
It allows you to focus on your business logic, and
it lets the Plugin Framework do the heavy lifting of connecting to Core over RPC.
You start working on it by cloning the terraform-provider-scaffolding-framework
repo into your own repository, and then you can customize it from there.
When working with Terraform, there are four basic operations
that Terraform Core needs to be able to run over the infrastructure.
They provision infrastructure, and they amend the Terraform state
accordingly. The Create operation creates the resource.
It could create a file, in our case, but it could also create an EC2 server,
or a user in a managed SaaS database.
The Read operation reads the current infrastructure state.
The Update operation changes attributes that can be changed according to the API.
Obviously, not all attributes can be changed.
If you go back to the EC2 example, some changes require you to
destroy and recreate the EC2 server, but some attributes, like the SSH key,
can be updated in place, and the Update API does that.
The last operation is Delete, which removes the resource. Terraform Core
also uses it for recreate operations:
in case a user wants to change attributes that cannot be changed without
destroying, Terraform Core will go ahead
and recreate the resource by using Delete and Create.
Again, if you look over the code, you can see that this is the cloned repo,
and there are a few things that I mentioned earlier.
The file API that I provided for you has a few basic operations:
read a line, write a line, count the lines, remove a file, and remove lines
from a file. That's something that I added, along with documentation and
examples.
If you look at the file resource here, you can see that each of
the primitives that I mentioned earlier has a method associated with it.
So we have a Delete method, an Update method, a Read method,
a Create method, and a few others that we'll cover a bit later on.
And if we expand the Create method, for example, we can see that it has
code that has to do with interfacing with Terraform Core.
It basically copies the request from
Core into a local variable.
Then there is logic that performs the operation on the infrastructure,
in this case creating a file and writing lines into it.
And then it returns a value back
to Terraform Core, to let it know what happened and
allow it to amend the state
accordingly.
Each of the resources and data sources has a schema with attributes.
Schemas and attributes are the mapping between configuration
blocks and provider code.
They define what parameters are needed to create a piece of
infrastructure or to change its values.
Some of them can be mandatory, some of them can be optional, but these
are the parameters that are needed to create the infrastructure itself.
Schemas have attributes that define specific data elements,
and each attribute has a type.
The type could be a primitive, such as an int, a bool, or a string, or it could
be a complex data type like a map, an object, or a list.
Each of the attributes also has properties: a description, whether it's
optional or sensitive, and other properties.
It can also have optional validators that check that the user-supplied values
match what the infrastructure expects.
If you take a look at the code, you can see that this is all
implemented in a Schema function.
The file resource has a description, "file data resource", and it has two
attributes.
One of them is filename, which has the description "file name".
It's required,
and it has a validator that forces the file name to
match a specific regular expression.
It also has an additional attribute called lines, which is a list attribute,
and each element is a string value, so it's a list of elements of type String.
In this specific case, it also has a validator that enforces the list to
have at least two lines, as an example.
You should note the descriptions under the attributes and under the
filename. Do use them, because they are then copied over to the
automatically generated documentation.
That generation basically copies the description, the file name, and
the element type here, and you can see this combination later on, copied
into the documentation published to the Terraform Registry for your end users.
Types.
The Plugin Framework types are not native Go types.
They have additional methods and functionality to handle
null values and unknown values.
For example, Int64 and the other primitives have an IsNull method that
returns whether the value is null or not, and primitive values are accessed
with the value methods: for example, the Int64 type has a method called
ValueInt64 that returns a native Golang int64.
Collections are converted into Golang types with the As methods.
For example, for a list, you can grab the elements
as native Golang types with ElementsAs, and then cast them
into a slice or array of strings.
If you go back to the code, you can see, for
example, in the Create function,
that the full name of the file is created by appending the base
path of the provider to the file name. The file name is a framework string
type, but we grab the Golang value with ValueString, which
returns a native string that we can use to append to other values.
After you've created the provider,
you're quite happy, and you want to run it and see that it works.
First of all, you don't have to publish the provider to the
Terraform Registry to run it.
You can run it locally.
You do that by using the dev_overrides
stanza inside the Terraform CLI configuration file, and by pointing the
TF_CLI_CONFIG_FILE environment variable at that file. In this example,
you can see that the CLI config file has dev_overrides, which maps the registry
address into a local path on my Mac.
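A minimal sketch of what that CLI configuration can look like (the provider source address and local path here are examples, not the demo's exact values):

```hcl
# File pointed to by TF_CLI_CONFIG_FILE (or ~/.terraformrc):
# load this provider from a local path instead of the registry.
provider_installation {
  dev_overrides {
    "registry.terraform.io/example/file" = "/Users/me/go/bin"
  }
  direct {} # everything else still comes from the registry
}
```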
You can use log-based debugging for simple cases, but you can also use
debugger-based debugging, which is far more powerful for complex ones.
You do that by running the main
method of your provider with the -debug flag set to true.
It will output an environment variable.
You copy it to wherever you want to run Terraform, and then you run a Terraform
action, and that action will break into the debugger.
Let's see how that works.
First of all, I have my code here.
Let's look at the run configuration and see that everything is set up correctly.
We have a debug configuration, and you can see there's a program
argument, -debug=true.
I set a breakpoint; for example, let's say I want to debug the Read operation,
so I set the breakpoint here, and then I run the program.
It compiles and then outputs an environment variable, which I copy
and paste here
as it is.
Now, this directory has a configuration for Terraform.
It has a provider block, a "file" provider, that defines the base path,
which is this directory.
It also has a resources file that defines two file resources, file_1 and file_2.
You can see that each file has a few lines.
And if we cat file_1, for example, you can see that it
has the two lines that are defined.
As I mentioned earlier, I exported the TF_REATTACH_PROVIDERS variable, and now
if I run terraform plan, it will start, and then it will break into the
debugger, and you can see that the Terraform run itself is waiting on the
debugger.
You can use the debugger to basically step through
your code like anywhere else.
And you can see, for example, the full name here that is created
in a Read operation for file_2.
If we resume the program, it will continue running, and then
it will end and refresh the state.
You can see that it just refreshed the state,
and in this case, there's no change in the files,
so there are no changes to the infrastructure.
When you're done with that, obviously, you want to stop the debugger.
And then we can go back to the presentation.
After you've debugged the code and you're happy, you'll want automatic tests:
acceptance tests for your resources and data sources, to allow you
both to check that the code is working correctly and, later on, to
plug into the GitHub Actions.
The basics are already checked in with this test suite, and if you want
to test operations that you created yourself on the infrastructure, you need
to add them to the acceptance tests.
You can run the tests manually with make test, and
the way it happens is that you create a resource-name_test.go
file in the same directory, and make will grab it.
So if we go back to the code and close the menu, you can see that, for
example, the file resource has a file_resource_test.go.
This file has functions, and each function is a test with
steps.
And as I said, the basic scaffolding is already in place.
For example, in this case, the test configures the provider with a helper
function that we'll see a bit later, and then the first step creates a file,
file_1, with two lines, "1" and "2", and checks the
attribute: it just checks that the filename attribute is file_1.
The second step is to actually update file_1 with lines "2" and "3"
instead of "1" and "2", and to check that the line in the zero
place, the first line, is equal to "2".
Both of these steps use a shared helper
function; it could be whatever you want, and it just returns a Terraform
configuration, basically,
with a provider stanza and a resource stanza.
And you can see that the values are appended in here, just to
make creating configuration files easier, instead of writing them
out explicitly again and again.
When we're done with that, we can run the acceptance tests, and in
this case, they passed, which is nice.
You've created your code, you've debugged it, you've created the acceptance
tests, you've created the documentation, and you are all fine and happy.
Now you want to publish it to the Terraform Registry.
The first step is to create a GPG key.
You'll use it for signing, and the Terraform Registry will use it to validate
that the code is indeed yours.
Set the GitHub secrets GPG_PRIVATE_KEY and
PASSPHRASE accordingly.
Create a Git tag named with a v prefix and the version.
The version would be, for example, v0.1.0 or v2.3.0.
Make sure that you follow semantic versioning.
This tag is what drives the GitHub Action that's included with the
terraform-provider scaffolding framework, and that action builds
all the executables and also pushes a webhook to the
Terraform Registry to allow it to grab the new release.
So you need the tag, really.
The next thing is to log into the Terraform Registry and add the repo.
Once you've done that, you authenticate back to GitHub.
It'll create a webhook that will push new updates
that are tagged into the Terraform Registry, and you wait a bit.
It takes a few minutes for the Terraform Registry to grab the changes from
GitHub, and then you can see your provider published in the Terraform Registry
for other people to use.
Everything published to the Terraform Registry is public,
so be aware not to publish any kind of proprietary code or anything
that is, let's say, a secret that you can't really divulge.
To wrap it up:
my team and I started from manual management of this specific database.
We had used Terraform extensively for other cases, but for this specific
database that we needed to manage, we had manual management scripts and
Confluence pages.
I had to learn Go and the Terraform Plugin Framework.
I created a provider and have released a few more features since.
It's currently at a 0.x version, and I published the provider, and my team
is using it, and I hope that other people around the world use it too, because
it makes managing databases easy.
Thank you all for your time.
I'm Harald Safra.
You can find my contact details on LinkedIn and GitHub; profiles are below.
I would more than welcome
any questions.
If you have anything, reach out to me, and I'd be glad to help
with anything that you need.
Thank you for your time.
Thank you for being here with me.