Conf42 JavaScript 2022 - Online

Learn BabylonJS to Create Your Own 3D Metaverse Environments


Abstract

Users expect immersive experiences that are not only entertaining but also create engagement and retention.

This workshop will explore some of the theory around user perception while learning the BabylonJS library for creating 3D environments in a web browser.

Additional hands-on sections explore extending this environment to include live streaming media and communications between participants in that metaverse environment. For real-time interactivity with sub-second latency, we'll cover some of the same WebRTC techniques being used by major companies for broadcast-quality workflows.

Summary

  • Griffin Solot-Kehl is excited to talk to all of you about building a browser-based metaverse with Babylon.js. We're going to take a look at 3D environment and game development all within JavaScript.
  • Babylon.js is a JavaScript library for 3D graphics rendering. Traditionally this sits in the game development space, but it isn't limited to game development at all, with uses in education, blockchain, data visualization and other areas beyond game production.
  • The Babylon.js Playground lets you create 3D environments directly in the browser; Babylon.js itself ships as an npm package. It's very flexible in how you build things, and we can use camera.attachControl to move the camera dynamically.
  • Next we have the lights, and lights are very important because without lights you won't be able to see anything. The sphere itself takes some sub-parameters that we pass directly into it, and Babylon.js has other pre-built objects you might want to build with.
  • The Babylon Playground lets you create scenes directly from 3D assets and concepts. You can even switch scenes directly within the environment that you have. This also extends to areas like VR and AR experiences.
  • Babylon.js lets you create your own virtual reality experiences. It has built-in support for VR headsets, so you can try out many of these playground examples on a Google Cardboard device.
  • Render loops let you modify models so they update live without rerendering your scene over and over again manually. Videos can be used as textures: even pre-recorded video can be overlaid directly onto a 3D environment.
  • Dolby.io live streaming lets you stream live content at extremely low latency. It enables real-time broadcasting directly in metaverses, which has produced some very good use cases. It all runs through an HTML5 interface, so as long as your web browser accepts it, it works.
  • You also have the ability to add spatial audio via Dolby.io. There are a lot of web-based ways to do live communications between different people. Make sure to check out community resources like the Babylon.js community, and if you have any questions, please feel free to tweet me.

Transcript

This transcript was autogenerated. To make changes, submit a PR.
Thank you so much for coming to my talk today. My name is Griffin Solot-Kehl and I'm excited to talk to all of you about building a browser-based metaverse with Babylon.js. I am a developer advocate at Dolby.io, where I work a lot in the audiovisual space, including live streaming, video production and post-processing of audio, and we're going to be talking about some of the things I've been working on and learning as I've gone through this journey of 3D environment and game development all within JavaScript. So let's talk a little bit about the agenda. First, I'm going to start off talking about what Babylon.js is in the first place. Then we're going to go into a couple of different examples where we see some code and get building, and this all starts with a hello world, then some 3D assets and concepts, and we'll continue further into areas like animating characters, adding videos directly inside our textures, adding live streaming directly within our web-based game environments, adding spatial audio and perception, and then wrapping things up with some examples and pointers for learning more. So without any further ado, let's get started talking about Babylon.js. Babylon.js is a JavaScript library for 3D graphics rendering. Traditionally this sits in the game development space, but it isn't limited to games at all: it's used for proofs of concept, you have 3D models, and it all uses HTML5 and WebGL to power everything together.
This is very good for cross-platform game development: if a device can run a web browser with HTML5 and WebGL, it can run Babylon.js. Any environment with an Internet connection, or even just local files served from a localhost server, is going to be able to run it. And it is Microsoft-backed, so a lot of engineers who worked on Microsoft projects started working on this, and we've seen a lot of uses in education, blockchain, data visualization and other areas beyond game production. What we see in the image on the right is the actual Babylon.js environment. This is the Playground that Babylon lets you access directly within the browser: you can update the code live, see the models change right there, explore all the different environments and save them. We're going to take a look at that in a little bit. Here are some examples of things that were built on Babylon.js, and we can see a couple of big ones. The top left over here has Minecraft, recreated directly in Babylon.js; Microsoft acquired Mojang, so it doesn't surprise me that much that it was built there. We can see a couple of other ones: Temple Run was built in Babylon, we have some shooter games over here, some pachinko going on, Space Invaders recreated in a 3D environment, which is pretty cool, as well as some non-game things. The center-right bottom one is a map interface where people took a globe and tried to "Google Earthify" Babylon.js in the web browser as a proof of concept.
And another one that's really cool in my opinion, in the lower right, is an art gallery that somebody implemented directly in Babylon.js, which starts getting at the whole metaverse vibe. Especially in this remote world, where I'm giving you this talk remotely in the first place, a lot of people are trying to come up with ways to display their exhibitions live via the Internet, and Babylon.js is a fairly foolproof option that doesn't require the very powerful tools and hardware you'd otherwise need to render all of this. So let's start off with hello world. For this I'm going to go over to the browser, to the Babylon.js Playground website. This is exactly what you see when you load it: the hello world they give you, with the code IDE on the left and the actual 3D environment to play with on the right, and this whole mesh of code is what you're actually seeing rendered over here. Babylon.js is an npm package, so normally you need to make sure you're importing everything properly, but the Playground does all that for you in the background. All of the juice of starting Babylon.js comes from this function over here: createScene. createScene is how we determine what we actually want rendered and how we want everything controlled. Essentially everything Babylon is going to work with needs to be within this function, so that we can actually establish the scene. The first thing we're doing is creating a scene: a basic scene object, no meshes, just letting us have an environment to work directly in. You can see that we're defining it by taking a variable, scene, and assigning it a new Babylon.js Scene. And what's really cool is that because Babylon is a package that we're able to manage,
it will let your editor autofill a lot of these as you see fit; these are known modules it can suggest automatically. new BABYLON.Scene(engine) uses the Babylon engine to create a new scene for us, and this scene is where we're going to put everything. Next we have the camera, where we determine how we actually want to view the scene. As the viewer looking inside this environment, you need to decide how you're seeing it. Here we're using a Babylon FreeCamera, and we're using something called a Vector3, which is essentially the XYZ coordinates that let you determine where in three-dimensional space you want the camera to be. (0, 0, 0) is the origin, the center of it; coordinates go positive, they go negative, they go into decimal values, so you can place things exactly where you want them. You can see over here that camera.setTarget(BABYLON.Vector3.Zero()) points the camera at the origin; it's a nice little shortcut to that spot. And we can change the location of all these things as well. So if we go back to the Vector3 and say, I don't want this at negative ten, I want this at negative 15, I can click the play button at the top, and you'll see we're zoomed out a little bit more. I can revert it back to ten, click play, and we're back where we started. This lets us play with where we're starting and where things go. But you don't have to play and rerender every single time: we can use camera.attachControl, this function over here, to move the camera dynamically. So instead of the camera being a fixed object, we can change its position as we move along.
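Pulling those pieces together, the Playground's camera setup looks roughly like this (a sketch of the hello-world camera code; it assumes a `scene` and the page's `canvas` already exist, as they do in the Playground):

```javascript
// Position a FreeCamera in the scene and aim it at the origin.
// Assumes `scene` and the page's <canvas> already exist, as in the Playground.
function setUpCamera(scene, canvas) {
  // Vector3(x, y, z): 5 units up, 10 units back from the origin
  const camera = new BABYLON.FreeCamera("camera1", new BABYLON.Vector3(0, 5, -10), scene);

  // Aim at (0, 0, 0); Vector3.Zero() is shorthand for the origin
  camera.setTarget(BABYLON.Vector3.Zero());

  // Let mouse drags on the canvas move the camera instead of leaving it fixed
  camera.attachControl(canvas, true);
  return camera;
}
```

Changing the `-10` to `-15` here is exactly the zoom-out experiment described above.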
So this one, attachControl(canvas, true), is attaching the control of the camera to the canvas itself. What this means is that we can click and drag our mouse on the canvas, and those mouse movements are what move the camera. You can also change it to move via the keyboard, or if you attach a game controller, you can use the joysticks like you would there. It's very flexible in how you set it up. Next we have the lights. And lights are very important because without lights, you won't be able to see anything; think about the sun and the earth, where if it weren't for the sun, you couldn't see anything. Here we're using a HemisphericLight. There are a few different types of light sources, with different ways they radiate; hemispheric light is one of the more common ones, especially if you're trying to emulate the sun. Once again we are positioning it, in this case directly overhead, and we can also point it somewhere else: we can make the direction (0, 1, 1), rerender, and you'll see the light is now at a slightly different position on the sphere that we have over there. It is completely dynamic and customizable, and you can also write functions to change it along the way. light.intensity determines how much luminosity that light has; it gets lighter and darker. The default is one, so we can change it back to one from 0.7, and it gets a lot brighter. If you have actual color, you'd get different saturation values and all that; this is just a gray, uncolored surface. And then we also have the shapes, where we are defining a shape right here, a sphere, so we can determine what it looks like. You do want to set the position of everything as well.
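As a sketch, the light setup being described looks like this (the direction and intensity values are the ones from the demo):

```javascript
// Add a sun-like hemispheric light and tune its brightness.
// Assumes an existing `scene`, as in the Playground's createScene.
function setUpLight(scene) {
  // The Vector3 is the direction the light comes from: (0, 1, 0) is straight overhead
  const light = new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);

  // Changing the direction, e.g. to (0, 1, 1), shifts the lit spot on the sphere:
  // light.direction = new BABYLON.Vector3(0, 1, 1);

  // 1 is the default intensity; the Playground's hello world dims it to 0.7
  light.intensity = 0.7;
  return light;
}
```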
It will default to the origin, but we can also determine what the sphere is actually going to look like. The sphere object takes some sub-parameters that we pass directly into it. You'll notice that all these builders take a name, then a set of options like position, and then they ask which scene you want to put the object into; we're putting everything into the scene variable that we started with at the very beginning. We can also look at some of these optional parameters, and if you hover over them, you get a little hint about what each one is looking for. diameter: 2 is pretty self-explanatory: it determines how big you want the sphere to be. We can make the sphere bigger by setting the diameter to five, and revert it back to two to make it smaller. But what's also very interesting is that this is 3D graphics rendering, so you have to decide how the mesh is built in the first place. The number of segments lets us determine how many polygons the sphere is made out of. At 32, we have a pretty realistic sphere; if you zoomed in a lot, you would definitely see some hard edges instead of a completely smooth, round surface. But we can make this look a lot blockier if we reduce the number of segments. If we move from 32 to 16, we start seeing a slightly rougher outline, and we can go even lower, to something like two, where it's extremely blocky. Fewer segments are easier on the rendering engine, whether you have a software or hardware renderer, and this also lets you customize a little bit of how things look. But let's say you don't want to work with a sphere in the first place.
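The sphere call with the two options just discussed can be sketched as follows (the default values match the Playground's hello world; the position tweak is the usual way to sit the sphere on the ground):

```javascript
// Build the demo's sphere, with the two options discussed:
// diameter (overall size) and segments (how many subdivisions, i.e. how smooth).
function buildSphere(scene, diameter = 2, segments = 32) {
  const sphere = BABYLON.MeshBuilder.CreateSphere(
    "sphere",
    { diameter, segments }, // try segments: 16 or 2 to see the blocky versions
    scene
  );
  sphere.position.y = 1; // lift it so it sits on top of the ground plane
  return sphere;
}
```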
Babylon.js does have pre-built objects that you might want to use instead. So instead of CreateSphere, I can say: let's create a cube. That is not the function name; we can do CreateBox, that's the name. Once I correct the text over here, removing all the sphere-specific options and switching everything from sphere to box, we can then click play and we get a box instead. Additionally, we also have the ground section over here, which defines this rectangular plane. That's just another object that we're calling the ground; you can set its size and scene the same way you would with everything else. It does not need to be a square: it can be a rectangle, it can be a circle, it can be whatever you want it to be. Now that we've gone through the introductory hello world that the Babylon Playground presents us, we can recap the introductory concepts we just learned. The main method we have is createScene, where we generate objects like our cameras, our planes, our lights and our meshes, using functions like MeshBuilder, FreeCamera and HemisphericLight to define the objects we place into our scene. If we want to set the position where things happen within our scene, we use Vector3s to define the XYZ coordinates of these objects relative to the scene. And then we have autocompletions provided by the npm package, so if you're using an IDE like VS Code that supports them, you get a bit of help building things without needing to know every single function, revisit the docs, cross-reference everything and start all the way over.
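Put together, the whole hello-world createScene, with the sphere swapped for a box as just described, looks roughly like this:

```javascript
// Minimal Babylon.js hello world, following the Playground pattern:
// camera, light, a box instead of the sphere, and a ground plane.
function createScene(engine, canvas) {
  const scene = new BABYLON.Scene(engine);

  const camera = new BABYLON.FreeCamera("camera1", new BABYLON.Vector3(0, 5, -10), scene);
  camera.setTarget(BABYLON.Vector3.Zero());
  camera.attachControl(canvas, true);

  const light = new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);
  light.intensity = 0.7;

  // CreateBox, not "createCube": box-specific options replace the sphere's
  const box = BABYLON.MeshBuilder.CreateBox("box", { size: 2 }, scene);
  box.position.y = 1;

  // The ground is just another mesh; it doesn't have to be square
  BABYLON.MeshBuilder.CreateGround("ground", { width: 6, height: 6 }, scene);

  return scene;
}

// Outside the Playground you'd wire it up yourself:
// const engine = new BABYLON.Engine(canvas, true);
// const scene = createScene(engine, canvas);
// engine.runRenderLoop(() => scene.render());
```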
So with 3D assets and concepts, we can expand on these hello-world ideas a little further, where we can do things like changing the colors of objects, or we can even directly load files. If you have an artist, for example, building these assets and models in a program like Maya, a professional actually working on them, then instead of trying to create a bunch of boxes and spheres and assembling them into what you want, you can import that mesh directly; you don't have to worry about needing to develop it all yourself. We've been able to use the mouse camera navigation, and you can even do things like switch scenes directly within the environment that you have. This also includes areas like VR and AR experiences, which we'll get into in a bit. But first, let's go back into the Playground, where we have this model that we downloaded directly from the Babylon.js community: you can see it's this very detailed skull that an artist made. So instead of needing to put together all these different meshes, we can just upload something directly and use it as we want, look around, see all the different angles of it and play with it in our own workshop. We can even take a look at another example over here, where we've taken our click-and-drag controls and made them even more advanced: we can click and drag any of these different shapes and move them where we want them to be. We have added color directly to them, and we've expanded the scene and the light to fit our needs a little bit. This also includes some tools that are very useful for the actual building process, where if we click on this debug section, we get a couple of new menus to take a look at.
We are able to see the scene with all the different materials we've worked with. So we have all the different objects, in these different colors, where we can see their actual positions, and all these class objects where we can inspect textures, lighting and all of that directly within a debug menu. But if we click this gear icon in the inspector area, we can also do things within Babylon like take screenshots, record videos, make GIFs, even export these directly. So if you wanted to save this object and load it somewhere else, you can do that directly within Babylon itself. If we go over to the original playground that we worked with as well, we've made some customizations, and one cool thing you can do over there is the save icon: it lets you create a little demo, where once we save it, you'll notice we have a new little hex code in the top menu, which allows us to save and reload this anywhere we want. We can save this customization directly to this workspace, so if I refresh the page right now, you'll actually see it once again exactly the way we left it. And you can share this with anybody you want: if you've made some customizations, or if you've made your own custom Babylon.js playground scene, you can share it with anybody, or you can download it just the way we showed here and upload it as its own Babylon environment. Going back to talking about VR and AR experiences: that's not something Babylon.js prevents you from doing at all. It actually has a lot of built-in support for VR headsets, where you can try out a lot of these different playground examples on your Google Cardboard device.
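The debug menus and the screenshot tool mentioned above are also reachable from code; a small sketch, assuming the `engine`, `scene` and `camera` from an existing createScene:

```javascript
// Open the inspector (the "debug section") and grab a screenshot.
// Assumes `engine`, `scene` and `camera` from an existing createScene.
async function openDebugTools(engine, scene, camera) {
  // Shows the scene explorer and inspector panels seen in the demo
  await scene.debugLayer.show();

  // Programmatic equivalent of the inspector's screenshot button
  BABYLON.Tools.CreateScreenshot(engine, camera, { width: 1280, height: 720 });
}
```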
If you still have one of those. It even has support for the WebXR framework, which is the update and replacement for WebVR, which means you can develop games and metaverse experiences for Oculus, Valve Index and HTC Vive headsets, and you can check out the documentation on how you might build that yourself. Taking a look at what that actually looks like, we have the community demos over here, where we can look at some of the different meshes. In this one, for example, you can see the playground has its own hex code that it's playing with, and what they are doing is importing the textures directly from different web URLs. You can see that this asset is stored as a texture in their GitHub repository, which is just being referenced via the Internet; the beauty of HTML5. It works very similarly if you have a model you want to import with all the textures included in it. And here we have a much more detailed environment where these textures are very detailed: you can see we have a nice background going on in 3D space, a very detailed bottle with a drop shadow, and you can see where the light source is. This table is basically the same thing we had with the plane before; we're just putting a little bit more detail and care into the objects. And on the VR side, we have another example from the playground where somebody took this car (I think it's the DeLorean, if I recall correctly) and they let you do a VR experience as a sample. So you see this little VR goggles button over here. If I were viewing this via a Google Cardboard, which I'm not, but you can see where it's coming from, you would be able to view this entire environment in 3D space via that headset itself.
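The modern WebXR route (the replacement for the older WebVR helpers mentioned below) is roughly a single call; a sketch, assuming an existing scene with a ground mesh:

```javascript
// Add the default WebXR experience: this renders the "VR goggles" button
// and handles entering and exiting immersive mode on supported headsets.
async function addXR(scene, ground) {
  const xr = await scene.createDefaultXRExperienceAsync({
    floorMeshes: [ground], // used as valid teleportation targets
  });
  return xr;
}
```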
Of course, I'm not streaming from a VR headset, so you might be a little bit disoriented the same way that I was. But it's very cool to know that a lot of that is built directly in. And if we look at what's happening, they're using createDefaultVRExperience. This looks like an older example that's a little bit deprecated, so we highly recommend you take a look at the actual Babylon documentation for the updated ways of doing it, though this one does still work, as we saw before. It's just very easy for you to take your 3D environment and put it directly inside that metaverse. So continuing on with our presentation, let's talk a little more about animation. We have a lot of static scenes that we've already seen: everything has been rendered, and we've just been looking around the render. But you don't have to keep everything static. You have the ability to use render loops to modify models so that they are updating live, without needing to rerender your scene over and over and over again manually. This can be done with scene.onBeforeRenderObservable, which runs a loop within the render function itself: it's always looking for some type of input or action to determine whether it should start or stop that new render. This can be used for moving characters; it's useful for actually physically moving them, making them do different emotes, or playing idle animations. You can assign this to key presses on your keyboard, to controller movements, to mice, or even get creative with head trackers: you can attach it to any type of input your device can recognize. It's all fair game. So this example over here shows what we have going on with this model. You can see we already have an idle animation going on, but I can use my keyboard to make her move and turn, and I can even press a button to make her dance.
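A minimal version of that pattern: an onBeforeRenderObservable callback that nudges a mesh every frame, toggled by a key press. The key handling here is a simplified stand-in for the demo's character controls:

```javascript
// Pure helper: advance a rotation by `speed` radians, wrapping at 2*PI.
function nextRotation(current, speed) {
  return (current + speed) % (2 * Math.PI);
}

// Hook the helper into the render loop; assumes `scene` and a `mesh` exist.
function animate(scene, mesh) {
  let spinning = false;

  // Toggle the animation with the space bar
  scene.onKeyboardObservable.add((kbInfo) => {
    if (kbInfo.type === BABYLON.KeyboardEventTypes.KEYDOWN && kbInfo.event.key === " ") {
      spinning = !spinning;
    }
  });

  // Runs once per frame, before each render: no manual rerendering needed
  scene.onBeforeRenderObservable.add(() => {
    if (spinning) {
      mesh.rotation.y = nextRotation(mesh.rotation.y, 0.02);
    }
  });
}
```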
So this is all directly within one instance. You notice that we're not manually rerendering anything; I'm not clicking a play button over and over again. It allows you to do exactly what you might expect from a game: everything is live, it's moving, it's not just a static workspace. This expands even further with using videos as textures, where yes, the videos can be pre-recorded, but it still allows you to take an actual 3D environment and overlay video directly onto it. This also enables things like audio, and it's all based on HTML5, so it works very similarly to a typical video container. But instead of the video container being a static, flat video, we use it as a texture for an object: we can put it directly onto a plane, onto a cube, a box, a sphere, somebody's head if we want to emulate what somebody's head would look like. We can hook in the same controls you would have in HTML as well, with playing and pausing assigned to buttons that you can interact with in the space itself. You don't need to assign everything to a key binding; you can put it as a user interface component within your browser as well. So we can see an example of this once again in this workspace that we've already created, where we have these couple of video objects and we can open up a menu with the different buttons we made: I can click on play and it plays the video for us. I think my audio is turned off right now to avoid any feedback, but you can see we're playing the video, and we can also pause the video on demand, replay it, or mute it, if there is audio playing right now. Oh, there it is.
You can pause it once again, and you'll notice that if we look around the environment, the video shows on both sides of the object. So let's take a look at how this is actually built. I'm going to go into a VS Code instance where we can take a look at this main.js file. We can see that we're just assigning a lot of different variables, but then we have the createScene function that we saw before, with the typical things: we have the camera, we're attaching the control of the camera, we're saying where the camera is. But when we actually want to place the video itself, we want to make sure the video is properly in scale. Most videos these days come at a 16:9 aspect ratio, but sometimes that's too big and it takes way too much power to render, so we can scale it down by taking that 16:9 ratio and creating a scale ratio that we either multiply or divide by, so we get the size we actually want the video to be as part of the box. Then for the plane options, we set the height and width we've already determined, as well as the side orientation. By default the texture isn't going to appear on every single side of a mesh, so we can use Babylon's DOUBLESIDE helper to put it on both sides of the plane and see it the way you would want. Then we finally build that video mesh: the video plane is a MeshBuilder plane, and we pass in the options and the scene, then take the video feed from a local file and place it onto the plane with some more helper functions, creating a video material and texture.
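The video-texture setup just described can be sketched like this; the file path is a stand-in for the demo's local file, and the scale factor of 4 is an illustrative choice:

```javascript
// Pure helper: shrink a 16:9 video to plane dimensions, dividing by `scale`.
function videoPlaneSize(scale) {
  return { width: 16 / scale, height: 9 / scale };
}

// Put a video on a double-sided plane; assumes `scene` exists and
// "path/to/video.mp4" is a placeholder for the demo's local file.
function addVideoPlane(scene) {
  const { width, height } = videoPlaneSize(4); // 4x smaller than 16x9 units

  const plane = BABYLON.MeshBuilder.CreatePlane(
    "videoPlane",
    { width, height, sideOrientation: BABYLON.Mesh.DOUBLESIDE }, // texture on both faces
    scene
  );

  const videoTexture = new BABYLON.VideoTexture("video", "path/to/video.mp4", scene);
  videoTexture.video.pause(); // don't start immediately; wire play/pause to GUI buttons

  const material = new BABYLON.StandardMaterial("videoMat", scene);
  material.diffuseTexture = videoTexture;
  plane.material = material;

  return { plane, videoTexture };
}
```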
And then we're placing it directly onto that video object, determining whether we want it to loop or autoplay, and also adding pause functions so it doesn't just start playing immediately; we just have it playing when we want, by clicking the button itself. A lot of this is additional helper functions so you can see it all happen together. We've been talking about the metaverse this whole time, and I don't think it would be much of a metaverse if we only had pre-recorded videos. So let's also talk a little bit about adding live streaming as video textures. Dolby.io live streaming allows you to stream live content at extremely low latency, where we are no longer limited to existing video files: we're able to actually broadcast video in real time. So you do the typical steps of acquiring an API credential at Dolby.io, where you can get one for free, and then instead of assigning the source to the video file, we assign it to the stream URL we received from Dolby, which enables real-time broadcasting directly in metaverses. We've seen a lot of very good use cases for this: live events, concerts, lectures, the live sports scene, auctions; the list goes on for a while. So we can take a look once again at the sample we have over here, and if I switch from video to stream, you'll notice the videos disappear, because our stream hasn't started yet. Over at our Dolby.io streaming dashboard I can click on start right here, and it's taking my live camera feed. Hello! Let's mute the microphone. We can now see that I'm being streamed live in Babylon over here. It is the same feed; you can see it's using those same objects, but now I'm able to put everything directly inside the stream.
I'm not sure how well the capture will show the difference between the camera you see at the top and the camera you see here in Babylon, but hopefully it's broadcasting to you pretty quickly, and we can take a look at how this all works once again in our code. This is the same exact file, but a little bit lower down, where we're doing things like defining the GUI buttons and doing some parsing. If we look at the stream button over here, we access the video texture in a different way: we set the URL equal to the streaming URL determined by our Dolby API key, and we switch the video texture over to that. The magic happens in our helper function in streaming.js, where we're actually taking in our credentials. Don't bother copying these; I'm going to rotate them as soon as I finish this video. Looking a little further down at how this is done, we're setting the video texture and then adding the stream based on that texture, functioning essentially the same as the pre-recorded video texture, just accepting a live feed via a URL. Once again, the beauty of using an HTML5 interface: it is the same kind of code, and as long as it's accepted in your web browser, there is no reason why Babylon.js can't accept it itself. So we take the token to authenticate, then change the view appropriately, broadcast, and change the texture appropriately, with some error handling in there as well. We can see an example of how one of our customers has actually gone a little further than this, a customer called Red Pill VR, where I'm going to mute the video so this doesn't get too loud and feedbacky.
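Stepping back to the streaming.js logic for a moment, here is a hedged sketch of that texture swap. The `getDolbyStream` helper is hypothetical: it stands in for whatever the streaming SDK returns (a MediaStream) after authenticating with your token, since the exact Dolby.io calls live in the talk's streaming.js:

```javascript
// Swap the plane's pre-recorded texture for a live one.
// `getDolbyStream` is a hypothetical placeholder for the SDK call that
// authenticates with your token and resolves to a MediaStream.
async function switchToLiveStream(scene, plane, getDolbyStream, token) {
  try {
    const mediaStream = await getDolbyStream(token);

    // Babylon can build a VideoTexture straight from a MediaStream
    const liveTexture = await BABYLON.VideoTexture.CreateFromStreamAsync(scene, mediaStream);

    plane.material.diffuseTexture = liveTexture; // same plane, now showing a live feed
  } catch (err) {
    console.error("stream failed to start:", err); // mirror the demo's error handling
  }
}
```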
But you can see that they've taken our live streaming and turned it into a metaverse-based DJ experience. The DJ is actually being recorded live, but they've placed him in this metaverse where there's an audience able to listen live. So if he tries to engage the crowd by saying "hey, how are you all doing?", it only takes a couple of milliseconds for that to reach the people on the other side who are experiencing this live concert built directly within here, as opposed to the couple of seconds it might take with traditional streaming, which I find really cool. And there's no reason why you couldn't build a very similar experience within Babylon.js, even incorporating 3D headset experiences, so you could see what's going on in as close to real time as possible. Continuing with the metaverse theme, you also have the ability to add spatial audio via Dolby.io as well, where you might want live communication within your game or metaverse environment, which you can do with an audio-based SDK for real-time communications. I'm not going to go over any code for this, but just know that there are a lot of web-based ways to do live communication between different people, and putting it directly in a game is no different from what you might experience in a multiplayer video game you play with your friends. You can add different spatial audio areas so that you take the position of where you see somebody in the Babylon.js scene and hear them from that direction, and that direction only. Instead of hearing everybody from every direction all at once, you hear people more realistically, as you might in real life.
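The idea of feeding a mesh position into spatial audio can be sketched with the plain Web Audio API. A real integration would use the communication SDK's own spatial-audio calls; this hypothetical sketch just shows the underlying mechanism of updating a `PannerNode` from a Babylon.js mesh each frame.

```javascript
// Sketch: positional voice audio driven by a Babylon.js mesh position.
// `voiceStream` is assumed to be a MediaStream carrying a remote participant's audio.
function spatializeVoice(scene, speakerMesh, voiceStream) {
  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(voiceStream);

  // HRTF panning gives a convincing "from that direction" effect in headphones
  const panner = audioCtx.createPanner();
  panner.panningModel = "HRTF";
  source.connect(panner).connect(audioCtx.destination);

  // Copy the mesh's world position into the panner every frame
  scene.onBeforeRenderObservable.add(() => {
    const p = speakerMesh.getAbsolutePosition();
    panner.positionX.value = p.x;
    panner.positionY.value = p.y;
    panner.positionZ.value = p.z;
  });

  return panner;
}
```

You would also update the listener position (`audioCtx.listener`) from the active camera each frame so the effect tracks the player's own movement, not just the speaker's.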
Once again, we have a customer example of this: you can visit Odyssey Stream if you want to see it, but just know that this is essentially taking the positions of objects within Babylon.js and feeding them into different SDKs so you can actually see and hear that relative positioning. To wrap things up, this is really just the tip of the iceberg when it comes to Babylon.js. If you want a self-paced workshop, you can visit bit.ly/metaverseworkshop to go more in depth on some of these experiences. It's an open source GitHub repository, which I highly recommend you all check out. Also make sure to check out community resources like the Babylon.js community, as well as docs.dolby.io if you want to start incorporating live streaming or video communication. I'm going to show off the Babylon.js documentation right now so that you all know where it is, how to find it, and how to get started yourself. It's all very well organized and very useful; you can even open the playground examples and play with them directly, so it's not traditional documentation. You can also see the workshop I referenced right here as a GitHub repository, which you can clone and run as a kind of interactive documentation that lets you create your own environment as you go. So thank you so much for listening and attending my talk; I hope you enjoyed it. If you have any questions, please feel free to tweet at me, you'll see my Twitter handle down at the bottom, and I'd love to see what you build in Babylon. If you have any questions or just want to show something off, I'd love to see your playground links.
Please tweet those at me or just say hi. I always want to meet more of you. Thanks for showing up, and I hope you enjoyed the conversation.

Griffin Solot-Kehl

Developer Advocate @ Dolby.io



