Abstract
In this talk the audience will learn how to develop an Android app that performs video editing.
We will use the services provided by Huawei HMS Core to create, in five steps, a cool app capable of giving your users the power to edit awesome videos with the most modern capabilities: a custom UI, stickers, AI filters and other cool features.
We all know video content is king nowadays, and we all know as developers that building a video editing app can be hard, from design to development. Using this SDK makes the process trivial.
Transcript
This transcript was autogenerated. To make changes, submit a PR.
Hi everyone, I'm Giovanni Laquidara, developer advocate at Huawei, and today we will see how to develop an Android video editing application in five simple steps. Nowadays we are surrounded by video applications. Video is the main content, and video applications are among the most downloaded ones in the stores. The videos that we share are not only recordings of our moments: they are also, in a sense, customized by us, by adding stickers, sound effects or even virtual objects on top of them, in a way that would have felt impossible, or maybe too expensive, many years ago. But this is now easy to do, both for you and for the users of your application. So, to create
this kind of video, we can install video editing apps. The most famous, like the ones in the slides, from Adobe Premiere Rush to Splice and Quik, are sometimes really complex to use for the people who download them, and many times social apps already integrate a video editing feature that allows the user to create, record and augment the videos they produce. Especially if we consider the Android world: if you go to the Google Play Store and look for video editing apps, you can find a lot of applications. But it is a common opinion that the quality of these apps sometimes falls short of what we can find in the iOS world. This is also because it is hard to build this kind of video editing app for Android. So what if I, as a developer, want to develop an application that allows my users to become directors, video editors? What can I do? At Huawei, we can help you: we developed the HMS Core Video Editor Kit. This is an SDK that is part of the big family of HMS Core services, and it is compatible with all Android smartphones running Android 7 and above. It offers a lot of features: video editing functions, a really extensive material library, multi-resolution export, and multi-video and image import supporting a lot of file extensions. So it offers a really big set of features. We will see today how we can use this kit in five simple steps, but the prerequisite of these steps is just to register yourself in the AppGallery developer console.
To do this, you have to create a Huawei ID, and this will allow you not only to access the power of the HMS Video Editor Kit, but also to access all the features offered by HMS Core. To register, just go to developer.huawei.com and fill in the form with your phone number, choosing a password and so on, and in a couple of days you will be allowed to access the AppGallery Connect console. This is the main UI through which you can manage all the apps, all the APIs that you can use, and all the projects you can create using the AppGallery services and HMS Core. Just remember, when you create an application with New app, to select the package type of the application: an APK for a regular app, or an RPK, which is our quick app package type. You also choose the device the application will run on; in this case we choose mobile phone. And don't forget to put in the app name and so on, all the information about the application. The AppGallery Connect console is also important because, if you go to the General information tab, you will see the API key that you will need to connect your app with our services' APIs. From this webpage you can also download the agconnect-services.json file. That is the main configuration file that you will have to include inside the source code of your Android application. It is similar to the google-services.json of Google Play services: just a configuration file that allows the application to interface with the services provided by the AppGallery console. So let's start together and see how we can use the HMS Video Editor Kit.
First of all, go to the main AppGallery Connect developer console; from there you will be able to see your projects. By clicking on My projects you can select your project, and then, from General information, like I said before, you can see all the information about your project. If you select the Manage APIs tab and you want to use the Video Editor Kit, you just have to enable the tick box corresponding to Video Editor Kit. Just enable it and you will be ready to use it inside your application. Enabling this option will also give you access to the Video Editor Kit menu. It's a menu that you can find on the left side of the interface: if you go there and click on Video Editor Kit, you will see the main user interface of the Video Editor Kit inside the AppGallery web interface, in which you can manage all the resources that belong to your project. After that, remember to download the configuration file, agconnect-services.json, and put it inside your Android application directory; this will enable your app to connect with the AppGallery services. Also take note of your API key, because it will be used in your source code to enable your application to connect with the API services that we are offering. All this information, the agconnect-services.json file and the API key, is accessible inside the General information tab. So this is the first step.
The second step is to open your Android application project and go inside the build.gradle file, the one belonging to the project. Add the Maven URL pointing to Huawei's HMS Core repository: add this line inside the repositories blocks of the build.gradle file belonging to your project.
Remember to add also the dependency on the AppGallery Connect (agconnect) plugin inside the dependencies block of that build.gradle file; this will allow you to use the classes that parse the agconnect-services.json file, so it's all done for you. The next step is to add the dependencies inside the build.gradle file of your app module.
So inside your app directory's build.gradle file, add the line that applies the agconnect plugin from Huawei.
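Put together, the Gradle configuration for this step looks roughly like this. The repository URL and artifact coordinates follow Huawei's public documentation, but the version numbers here are assumptions, so check the current release notes:

```groovy
// Project-level build.gradle
buildscript {
    repositories {
        google()
        mavenCentral()
        // Huawei's Maven repository for HMS Core artifacts
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        // AppGallery Connect plugin; it parses agconnect-services.json for you
        classpath 'com.huawei.agconnect:agcp:1.6.0.300' // version is an assumption
    }
}
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

// App-level build.gradle
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'

dependencies {
    // Video Editor Kit UI SDK; version is an assumption, check the docs
    implementation 'com.huawei.hms:video-editor-ui:1.1.0.300'
}
```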
Then add the line corresponding to the Video Editor SDK to the same build.gradle file as an implementation dependency. So, plain and easy. The third step
is to declare the permissions that your application will need to use the power of the HMS Video Editor Kit. As you can see, there are a lot of permissions, so your first question might be: why do you need all these permissions? First of all, we have the Internet permission, because this is an API-based service, so it needs an Internet connection. It needs to read from and write to external storage, because you will have to import resources from storage and export your edited video files to external storage. And you will need the permission to record audio from the microphone, because maybe you will want to add a voiceover or some recorded sound to your video editing project.
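In AndroidManifest.xml, these declarations look like this (standard Android permission names; the vibrate permission is the one discussed next):

```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.VIBRATE" />
```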
You will also need the vibrate permission, because we use it in our SDK to handle the user experience of the Video Editor Kit. We will see in a bit that, when the user uses the main UI of the Video Editor Kit, they can add resources on top of the video; but if they add a resource in an area of the video where it is not allowed, the device will vibrate. That's the reason why we need the vibration permission. So, step three done.
Let's go into the code: step four is about declaring a callback. This callback is called the media export callback, and this step is mandatory because the Video Editor SDK needs it to know what to do when the video is exported. In this case, in the sample in the slide, we just print a log. So, plain and easy.
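As a sketch, the callback from step four could look like this in Kotlin. The class and method names follow Huawei's documentation for the UI SDK, but treat the exact package names and signatures as assumptions:

```kotlin
import android.util.Log
// Package path per Huawei's docs; verify against the version you use
import com.huawei.hms.videoeditor.ui.api.MediaApplication
import com.huawei.hms.videoeditor.ui.api.MediaInfo

val exportCallback = object : MediaApplication.MediaExportCallBack {
    override fun onMediaExportSuccess(mediaInfo: MediaInfo) {
        // Called when the edited video has been exported; here we just log it
        Log.i("VideoEditor", "Exported to ${mediaInfo.mediaPath}")
    }

    override fun onMediaExportFailed(errorCode: Int) {
        Log.e("VideoEditor", "Export failed with error $errorCode")
    }
}
```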
Step five, the last step, is to write the code needed to configure the video editor and launch it. First of all, we need to declare and assign a license ID. The license ID is an ID identifying the user of your application, and you can use it to assign quotas to your users; just assigning a random UUID will work for this sample. After that, you need to authorize the Video Editor Kit inside your application to communicate with its server side in our cloud services. To do this, we just have the method setApiKey, with the API key that can be found inside agconnect-services.json, or that you can look up in the AppGallery Connect main UI. After that, you need to assign the media export callback that we defined in step number four; to do this, you just need to call setOnMediaExportCallback with the callback you previously declared. Then you just need to launch the Video Editor Kit: to do it, call MediaApplication getInstance launchEditorActivity with its parameters. These are the full parameters; if you go to our documentation pages on developer.huawei.com, you can find all the parameters that can enrich this call with further features. If you do this, it's done. Your application
will launch our Video Editor Kit UI, which will allow your users to import resources such as videos, move these resources along the timeline, play or pause, and assign stickers or effects on top of them. Like in this video: you can see that I wanted to assign a sticker, then I changed my mind and wanted to write some text on top of my video, like "Ciao", and I can move the text around on top of the videos that I imported. If the user tries to move the text outside the video, there will be the vibration from the smartphone, the one that I told you about before. So this is the main UI of the video editor application. We are offering these services for free, and in just five steps you can achieve this.
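The configure-and-launch calls of step five can be sketched like this. Method names follow the talk and Huawei's docs, but the exact signatures and launch parameters are assumptions (apiKey, exportCallback and activity are placeholders), so check the API reference:

```kotlin
import java.util.UUID
// Package path per Huawei's docs; verify against the version you use
import com.huawei.hms.videoeditor.ui.api.MediaApplication

// A license id identifies a user of your app (useful for quota management);
// a random UUID is enough for this sample
MediaApplication.getInstance().setLicenseId(UUID.randomUUID().toString())

// Authorize the kit against the server side, using the API key from
// the General information tab / agconnect-services.json
MediaApplication.getInstance().setApiKey(apiKey)

// Register the export callback declared in step four
MediaApplication.getInstance().setOnMediaExportCallBack(exportCallback)

// Launch the prebuilt editor UI; the docs list further launch parameters
MediaApplication.getInstance().launchEditorActivity(activity)
```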
So it's a batteries-included service for video editing features. Your next question could be: yes, but we are talking about stickers, maybe sound effects, transitions; where are all these resources? And maybe, can I upload mine, can I use my own resources as well? First of all, as I said before, accessing the Video Editor Kit from the AppGallery console enables you to turn the usage of these resources on or off: you can enable stickers to be used, and you can select which stickers are used, choosing them from our extensive library. But you still cannot upload your own resources: you cannot upload your own PNG or GIF files to be used as stickers. This is a feature that we will provide in the future. So at the moment, you just access the Video Editor Kit main UI inside the AppGallery Connect console and select the resources that you want to use in your application. So even this, plain and easy.
The next question you could have is: okay, I have this preconfigured video editor, the UI is preconfigured, so it's a fixed UI. What do I do if I want to add some features, or I don't want some features? For instance, what if I don't want to allow my users to add stickers to their video project? How can I customize the user journey, or simply modify the appearance of the main video editor UI? What can I do? Well, what I showed you before is what we call the UI SDK of the Video Editor Kit. But we have another SDK that enables you, as a developer, to customize even more the features of the Video Editor Kit that you want to use. This SDK is called the fundamental capability SDK, and it gives you the superpower to configure whatever you want.
Whatever you want to customize, you can customize straight away. So let's see what we need to do to use it. First of all, don't forget, inside the build.gradle file of your project, to add the agconnect plugin; never forget this. Second, instead of adding the dependency line of the video editor UI SDK, just add the dependency line of the video editor SDK. By the way, the one in this slide is the latest version supported by us. So just add the implementation line for the Huawei HMS video editor SDK. That's it.
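In other words, the only dependency change is this one line (artifact name per Huawei's docs; the version is an assumption):

```groovy
dependencies {
    // Fundamental capability SDK instead of the prebuilt UI SDK
    implementation 'com.huawei.hms:video-editor-sdk:1.1.0.300'
}
```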
After adding this dependency, you have the power to use all the features of the HMS Video Editor Kit and to customize them. But I want to leave you with the main structure of the Video Editor Kit fundamental capability SDK, because you will need to know it to build your own UI and to enable the features you choose. First of all, we have the main object: HuaweiVideoEditor. This object contains an object called the timeline. The timeline object contains all the lanes of the resources of the video editing project: there is the video lane, which contains all the videos associated with the project, the transitions, and maybe some effects on top; the audio lane, which contains just the audio; the sticker lane, which contains the stickers and their whole lifecycle; and the effect lane, for sound effects and video effects. So this is the main structure. Essentially, to use the video editor component in your project, you can just call the create method of HuaweiVideoEditor or, if you already created it, just use getInstance.
That's it. When you have created this video editor object, you can prepare it for launch with initEnvironment; this will initialize for you everything needed to use the video editor. But don't forget that the editor is something associated with the user interface: it has to be associated with a layout. So don't forget to first create a layout for the editor; in this sample a LinearLayout will work. Then assign it to your editor with setDisplay, passing in the layout.
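A minimal Kotlin sketch of creating the editor and binding it to a layout, assuming the class and method names from Huawei's docs (the layout id is hypothetical):

```kotlin
import android.widget.LinearLayout
// Package path per Huawei's docs; verify against the version you use
import com.huawei.hms.videoeditor.sdk.HuaweiVideoEditor

// Create the editor (or use HuaweiVideoEditor.getInstance(...) if it exists)
val editor = HuaweiVideoEditor.create(applicationContext)

// Initialize everything the editor needs before use
editor.initEnvironment()

// The editor renders into a view, so bind it to a layout first
val preview = findViewById<LinearLayout>(R.id.editor_preview) // hypothetical id
editor.setDisplay(preview)
```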
Plain and easy as it is. To access the timelines that I showed you before on the main structure slide, you just need to call getTimeline on the editor object, and from that you will be able to access, for example, the video lane: the video timeline, accessible with appendVideoLane called on the timeline object.
Using the video lane, you can append videos or images from your own resources: you can host your resources on your own server, or maybe bundle some video and image resources inside your application, and you can append them with these methods to your main video timeline. If you want to use the audio timeline, you just need to call appendAudioLane on the timeline. In this way you will be able to use the audio lane, and with it you can append audio on top of your video editing project with appendAudioAsset, passing in the name of the file corresponding to the sound or the music that you want to associate. To use stickers, just get the corresponding lane with appendStickerLane, and then you will be able to attach stickers on top of your project using appendStickerAsset. Or, if you want to append text, you just need to call appendWord with the text that you want to use, and that's it. All these timelines can be actioned by commands like play and pause; we have methods for that. Just call the editor's setPlayCallback if you want to assign a callback to the play action or, if you want to play the whole timeline straight away, just call the editor's playTimeLine with the starting time and the ending time of the playback.
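The lane operations above can be sketched as follows. Method names are taken from the talk and Huawei's docs, while the asset paths and time values are hypothetical and the exact signatures may differ:

```kotlin
// timeline object from the editor (editor.getTimeLine() in Java)
val timeline = editor.timeLine

// Video lane: append your own video or image resources
val videoLane = timeline.appendVideoLane()
videoLane.appendVideoAsset("/sdcard/DCIM/clip.mp4")      // hypothetical path

// Audio lane: music or a recorded voice on top of the project
val audioLane = timeline.appendAudioLane()
audioLane.appendAudioAsset("/sdcard/Music/track.mp3", 0) // asset, start time

// Sticker lane: stickers or text over the video
val stickerLane = timeline.appendStickerLane()
stickerLane.appendWord("Ciao", 0, 3000)                  // text, start ms, duration ms

// Play the whole timeline from start to end
editor.playTimeLine(timeline.startTime, timeline.endTime)
```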
Great. And in the end, if you want to export the video project that your user created, you can define the output file and, with these lines of code, the properties of the project that will be exported, so of the final video; and you can export the video by calling the export manager's exportVideo with all the parameters that you will need. So this is the final step.
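Export can be sketched like this. HVEVideoProperty and HVEExportManager appear in Huawei's docs, but the constructor arguments, the callback type and the output path here are assumptions:

```kotlin
// Package paths per Huawei's docs; verify against the version you use
import com.huawei.hms.videoeditor.sdk.HVEExportManager
import com.huawei.hms.videoeditor.sdk.bean.HVEVideoProperty

// Properties of the final video, e.g. the output resolution
val property = HVEVideoProperty(1920, 1080)
val outputPath = "/sdcard/Movies/my_edit.mp4" // hypothetical path

// exportCallback reports progress, success (with the file path) and failures
HVEExportManager().exportVideo(editor, exportCallback, property, outputPath)
```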
If you want to navigate through the code of our samples, just go to our main GitHub repository, github.com/HMS-Core, and there you will find the HMS video editor demo source code, in which we split the two main SDK behaviors: you will find the project for the UI demo, the one that I showed you before, and the project for the fundamental capability SDK inside the SDK demo directory. So try to explore the source code, play with it, and see if it can actually be useful for your idea, to create a video editing project for your application or a video editing feature for your users. If you want to learn more, also feel free to go to our main developer documentation website: you can scan the QR code on the left to go to the HMS Core official website. And if you are curious about the Video Editor Kit but you don't want to look at the source code, and you just want to use the mobile application to test the features, feel free to scan the QR code on the right and download and install the demo application. As I said in the beginning, it is compatible with all Android smartphones running Android 7 and above. So, thank you so much for attending this talk.