Conf42 Incident Management 2024 - Online

Using AI & Python to Prototype Financial Analysis Solutions

Abstract

I leverage the power of AI and Python to rapidly build prototypes for financial analysis solutions. If you’re interested in streamlining your financial analysis process, let’s chat!

Transcript

This transcript was autogenerated. To make changes, submit a PR.
So we're going to be talking about building software prototypes using AI tools, and the programming language of choice, of course, is Python. We will be discussing examples of time series analysis, and the domain-specific area we will focus on is financial analysis. We'll be using this capability to build an algorithmic trading system. So let's get to it.

Now, we are all familiar with concepts like machine learning, data analytics and rules-based, data-driven decisions. However, building software using these AI tools is a very recent innovation, thanks in part to the generative AI tools that are now available to us. So we'll be discussing how we can adopt and adapt these AI tools to build financial applications. But before we get to that, let's look at the base skills necessary to achieve something like this. It's not set in stone that we should follow one approach over another, but the technical skills and the business skills essential to building such applications matter a great deal, because an AI tool is just a tool. The real secret sauce in all of this is the individual, the human being with the relevant skill sets needed to achieve this. So there are foundational skills, data-specific skills, problem-solving skills, professionalism and collaboration expertise, and of course the domain knowledge a person should have to be able to build such things. And not least, interpersonal skills, because if you're not able to communicate and collaborate, you will not be able to build things at the scale we are accustomed to.

All right. The next thing is, we are all aware of where we are today, but it's important to realize how we got here. In the 1950s to 1970s, the early foundations of software development were laid. From there we came across the object-oriented programming concept, and then of course there was the internet revolution in the 90s. Why am I discussing this? Because we have to appreciate the journey that brought us to the AI revolution we are in today: every stage and every step we passed through led us here, and this is why we have to keep adopting such technologies, because over time things evolve. In the early to mid 2000s we had the cloud transformation journey, and we now know how that has turned out. Then, from the late 2010s up to the pandemic, we were all using algorithms or data-driven machine learning models in our daily lives in some way or form without even noticing it. For example, the recommendation engines from Netflix or Spotify, with playlists being recommended to you, and plenty of other things we were exposed to. But it was very much behind the scenes. Now, with the advent of the generative AI tools that have come into the mainstream, especially the LLMs that are out there, we as individuals and users are directly interacting with AI, and this has opened up a whole new paradigm where the possibilities are endless. I am really excited about this period, because I come from a technology background and a business background.
So what happens is, a person like me who understands business problems and solutions can now use these generative AI tools to quickly iterate and prototype without waiting for a large team to become available. This is enabling a lot of creativity, a lot of rapid prototyping and a lot of innovation, which existed in the past too, but the pace at which we can do it now has accelerated.

As I mentioned at the beginning of the session, we are going to be talking about financial analysis, which is a specific domain area of time series analysis. Some of you may already have experienced time series analysis in one shape or form. The first use case, obviously, is the financial one, algorithmic trading and risk management, which we'll be covering in this discussion. In healthcare, there is patient monitoring and epidemiology. In maintenance, there is predictive maintenance and quality control, used by process and manufacturing industries. There is demand forecasting and inventory management across many different industries. In energy, power generation and power management, there is energy load forecasting and smart grid management. Again, these are all based on time series data that comes in, and businesses are managed around it. Weather forecasting is time-driven too: every minute, every hour, every day, every week we know how the weather is behaving, and we see and interact with those forecasts day in, day out. Network traffic analysis and fraud detection are also time series problems: in data networks, every packet and every piece of information is timestamped, and in fraud detection we know the past behavior at particular times, so if current activity does not follow the expected pattern for that time, we can determine what is going on and manage the behavior accordingly. Economic forecasting and public health monitoring are also time-based, obviously not at the most granular level, but usually weekly, monthly or quarterly. So those are the aspects of time series.

By the way, you will see my LinkedIn details here; if you have any questions around any of these topics, feel free to drop me a note on LinkedIn and I'll be happy to respond. Again, we all know AI is here and it's here to stay, and we are accelerating the adoption and adaptation of AI into our daily lives. Estimates suggest AI could contribute an increase of up to 26 percent to the world's GDP over the next four to five years, so we are really in an exciting time. Of course, some of us are a bit concerned about what will happen to the current way of doing things, but as I mentioned earlier, it's an evolutionary journey we are already accustomed to. We have to adapt, bring this into our daily lives and our professional lives, and then leverage it for the next stage in our careers, our profession and our personal lives.

Okay, so what am I doing? What does this chart depict? I come from a program management background, a business solution background and a technology background, so it's important that there is a method to the madness. Everything has to have a flow, everything has to have a structure, so that
it is repeatable, it is comprehensible, and people can collaborate within it. So what I've done, to build such a prototype, is map two methodologies onto one: I've taken the data science methodology CRISP-DM and aligned it with the lean development methodology we all know from the software development cycle, to come up with this model. Why am I doing this? I'm training my GPT, which I will talk about later on, to recognize that this is the flow of information and the method of interaction I want it to follow, so that there is a repeatable discipline in the AI model that is helping me develop this prototype. We are aware that AI models, and LLMs especially, can hallucinate and go off in different directions. I want to give the model a precise context for how I want to get things done, and use that as an approach, as a method, to get the output that I want. So in the subsequent slides I will talk about what else I'm doing. I'm using this methodology to tell the model: this is the flow of information I want you to follow.

Moreover, I'm training the AI model to develop the software prototype I require while reflecting the real-world structure we are accustomed to and which we know works. Why reinvent the wheel when we know a structure like this works? So I've given it a team structure, assuming there are multiple team members involved in a program or project who are building this prototype. I've trained the model to recognize that team structure, so the roles can interact with each other, question each other and come up with the right software prototype, one that is validated by all of these different teams. We all know the roles: software developers, data scientists, data analysts, domain experts, quality assurance team members, security specialists, DevOps engineers, data engineers and so forth. I've used all of that as input and trained the AI model to recognize it and build me a software prototype based on these conditions. Another thing I mentioned earlier is that I've adopted the lean approach mapped onto a data management approach so the model recognizes which process to follow, because it has to follow a certain process to give me the design output I am looking for.

I've also trained the model on the architecture I'm looking for. There is data ingestion or data collection, there is data preprocessing, there is strategy implementation for the algorithmic trading system I'm building, there is a backtesting component it needs to recognize, there are risk management criteria it needs to follow, there is how it should execute the trades, there are the optimization parameters it needs to consider, and there are the monitoring and logging capabilities I want so that I can debug and understand how the application is behaving. So I even trained the AI model on the architecture I'm looking for.
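For illustration, here is a minimal Python sketch of the kind of module structure described above (data ingestion, preprocessing, strategy, risk management, backtesting, monitoring and logging). The class and method names are assumptions made for this example; they are not the speaker's actual prototype code.

```python
# A minimal sketch of the architecture described above, for illustration only.
# All names are assumed for this example, not taken from the speaker's prototype.
import logging

logging.basicConfig(level=logging.INFO)  # monitoring and logging capability
log = logging.getLogger("algo_trading_prototype")


class DataIngestion:
    """Collects raw market data (e.g. OHLCV bars) from a broker or data vendor."""

    def fetch(self, symbol: str) -> list[dict]:
        raise NotImplementedError("plug in your broker or data-source API here")


class DataPreprocessor:
    """Cleans the raw time series before it reaches the strategy."""

    def transform(self, raw: list[dict]) -> list[dict]:
        return [bar for bar in raw if bar.get("close") is not None]


class Strategy:
    """Implements the trading rules; returns +1 (buy), -1 (sell) or 0 (hold)."""

    def signal(self, bars: list[dict]) -> int:
        return 0  # placeholder: a real strategy would inspect the bars


class RiskManager:
    """Applies position-sizing and stop-loss limits before any order is sent."""

    def approve(self, signal: int, portfolio_value: float) -> bool:
        return signal != 0 and portfolio_value > 0


class Backtester:
    """Replays historical bars through the strategy and records hypothetical results."""

    def run(self, strategy: Strategy, bars: list[dict]) -> dict:
        log.info("Backtesting %d bars", len(bars))
        trades = sum(1 for i in range(len(bars)) if strategy.signal(bars[: i + 1]) != 0)
        return {"bars": len(bars), "trades": trades}
```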
Another thing I did: if some of you are from a computer science background, you will be aware that there are design patterns in everything we do. So I went a step further and defined the modules I'm looking for, listed out the functionalities I want, and noted which design patterns could be considered. And even though I'm prompting the model to consider these factors, I've also asked it to continuously learn from the prompts I initiate, as a chain of thought and ongoing engagement with the model. So as the software prototype evolves and I engage in further prompts, the model should evolve itself. This is only a starting point, but I've kept room for evolution.

Now, how did I use these AI tools? You're all aware of ChatGPT. GPT-4o was the LLM I used, and there is a feature within ChatGPT where you can build your own custom GPTs. So I built a custom GPT called QuantLab, which allows me to research, to define a delivery methodology and structure, and to use it for requirements analysis and solution architecture design. Eventually, of course, I use it for generating boilerplate or template code, which then goes into GitHub Copilot inside Visual Studio Code. In Visual Studio Code, with GitHub Copilot as the interface, the code that QuantLab generated on top of the GPT-4o LLM gets refactored, debugged and tested. And since I'm building an algorithmic trading system based on time series analysis, I have to identify certain parameters and tune and optimize them to get the desired output. Even though I'm referencing a financial services example, this kind of time series analysis can be done on any time series data.

Now, what is the approach I used? This is also what I used to train my custom GPT on top of GPT-4o: I provided certain training data, I did prompt engineering and trained it using certain prompts, which I will show you, and, more importantly, I provided contextual input. This contextual input is historical data, a time series, for the model to recognize what it is working with. So instead of hallucinating, it has specific data points to consider when building such a prototype. It's important to give context to the model. And since we are building an algorithmic trading strategy, I also gave it a strategy input: I came across a research document describing a strategy I felt had promising potential, and I loaded it into my custom GPT. With all that input, and of course a lot of iteration and refactoring, I arrived at boilerplate code. That boilerplate code, as step one, eventually went into step two, Visual Studio Code, where I used GitHub Copilot to refactor and retest it, and it was able to generate processed data. If you remember, there was a data ingestion and data collection class or process I was considering: the market data is pulled from the broker data source and processed, and that processed data then goes back into the custom GPT, which is based on GPT-4o, where I can do exploratory data analysis using its data analysis and analytical capabilities.
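Below is a small, hedged sketch of the data ingestion and preprocessing step described above. The talk does not name the broker data source, so this example uses the public yfinance package as a stand-in; the exported file represents the kind of processed dataset that could then be fed back into the custom GPT for exploratory analysis.

```python
# Illustrative sketch only: the broker data source is not named in the talk, so this
# example uses the public yfinance package as a stand-in for the market data pull.
import pandas as pd
import yfinance as yf


def ingest_market_data(symbol: str = "AAPL", period: str = "1y") -> pd.DataFrame:
    """Download daily OHLCV bars and apply minimal preprocessing."""
    bars = yf.Ticker(symbol).history(period=period, interval="1d")
    bars = bars.dropna(subset=["Close"])
    bars["returns"] = bars["Close"].pct_change()  # simple derived feature for analysis
    return bars


if __name__ == "__main__":
    data = ingest_market_data()
    # Export the processed frame; this is the kind of dataset that could be fed back
    # into the custom GPT for exploratory data analysis, as described above.
    data.to_csv("processed_market_data.csv")
    print(data.tail())
```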
I can do visualizations, I can generate insights, and based on those insights I can understand the behavior of that data. I can then go back into Visual Studio Code and do an optimization run or a parameter-tuning run, and eventually, once I feel it's producing the right output, I can consider this a prototype and take it to the next step.

One other thing I did: if you remember, there was a system architecture I put together. I trained the AI model using that system architecture, which again is not a hard and fast rule, but it was a starting point for the model to realize this is the output I'm looking for and to follow the architecture design I intend to generate. Some of the prerequisites for building such a custom GPT: I had to train it over several days, roughly a week of training and prompt engineering back and forth, to get to a stage where I felt the output of the code was sufficient for me to play around with and get the desired results. By the way, I call this custom GPT QuantLab. The link you see here is publicly available; you can copy and paste it or follow it in ChatGPT, and it should take you to the custom GPT I've built, and you're free to use it.

Another thing I need to highlight is that you need to provide specific context to the model. Some may call it RAG, some may call it context. I ended up pulling in a research paper that defined a very promising trading strategy. Of course, this is not a talk on financial advice or on building commercial-grade trading systems; it is for educational purposes only. But using this document I was able to give the model context so it could recognize the rules I was looking for, and it did a pretty good job. More importantly, I also provided the model with starter code from my previous projects, which I knew worked fine and had no bugs, so it would understand the structure of the code the way I prefer to write and read it, because it's personalized to me. So all of these inputs were provided to the QuantLab GPT that I had trained.
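The optimization or parameter-tuning run mentioned above could look something like the following sketch: a plain grid search over the windows of a simple moving-average crossover rule. The crossover rule here is only an illustrative stand-in, not the strategy from the research paper referenced in the talk.

```python
# A sketch of the optimization / parameter-tuning step: a grid search over the
# windows of an illustrative moving-average crossover rule (not the strategy from
# the research paper referenced in the talk).
import itertools

import pandas as pd


def backtest_crossover(prices: pd.Series, fast: int, slow: int) -> float:
    """Return the total return of a long-only moving-average crossover strategy."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    position = (fast_ma > slow_ma).astype(int).shift(1).fillna(0)  # act on the next bar
    strategy_returns = prices.pct_change().fillna(0) * position
    return float((1 + strategy_returns).prod() - 1)


def tune_parameters(prices: pd.Series) -> dict:
    """Grid-search (fast, slow) window pairs and keep the best-performing one."""
    best = {"fast": None, "slow": None, "total_return": float("-inf")}
    for fast, slow in itertools.product([5, 10, 20], [50, 100, 200]):
        total_return = backtest_crossover(prices, fast, slow)
        if total_return > best["total_return"]:
            best = {"fast": fast, "slow": slow, "total_return": total_return}
    return best
```

Using the ingestion sketch shown earlier, this could be invoked as, for example, `tune_parameters(data["Close"])`.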
Now, let's do a walkthrough. All right, enough talk, let me show you what it looks like. Here we are; let's start from the beginning. There was a lot of training, as you can see. I provided a starter code, which is a Python script, and I provided that contextual trading strategy to the model, and of course it started generating outputs. You can see all the different prompts I gave it, and eventually, after several iterations, and by the way you can even do graphical visualizations and tables in GPT-4o, which makes it an amazing tool for this, I kept on prompting and it kept giving me different results until it started producing the code I was looking for. With a lot of back and forth, I eventually got to a point where I was satisfied that this was something I could work with. But before I go to the execution side of it, let me show you what this QuantLab is all about.

Let me go into the edit functionality of this custom GPT and into the configuration, so I can show you how I actually built QuantLab as a custom GPT. There is a capability within ChatGPT where you can build your own custom GPT; I'm sure you can find more details about it, but to summarize how it works: you give it instructions, there are conversation starters, and there are knowledge inputs, which you can see over here. If you remember, these were the diagrams and charts: I gave it tabular information and visual information, so it was able to read and comprehend what I was expecting from it. And of course there were a lot of prompt exercises I provided, and eventually it gave me the results I was looking for. That's QuantLab, built on GPT-4o.

Now, let's go to Visual Studio Code. In Visual Studio Code, I transferred over the code generated using QuantLab; this was a Jupyter Notebook, by the way. I executed some of the code to make sure there were no bugs, and more importantly I used GitHub Copilot to refactor some of it. For example, here I asked it to give me a project structure for how my project should be organized. This started out as Python scripts, but eventually I moved to a Jupyter Notebook for ease of execution; I had some preferences around that. All my configuration is kept separately in configuration files; I did not want to hard-code it into the code. The way I designed it, if tomorrow I interface this application with another front end, the JSON file will be able to capture the user-defined parameters and configuration, and then dynamically update my code and my trading strategy. That's how I use JSON here. You can see the code that's been generated; this is a very recursive refactoring exercise, and once the code is sufficient for you, you can obviously stop. But in the field of technology we're never satisfied, we always want new versions and new features, so it becomes a very fun exercise where you're interacting with a GPT, telling it what you want next, and over time you build things you like and take it from there. So a lot of back and forth using GPT-4o and GitHub Copilot, to eventually get to a stage where I'm happy with the current structure of the code and it gives me the desired outcomes and outputs I'm looking for.
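Here is a minimal sketch of the JSON-based configuration approach described in the walkthrough, under assumed file names and keys: parameters live in a separate file so another front end could later update the strategy without touching the code.

```python
# Sketch of keeping strategy parameters in a separate JSON file, as described above.
# The file name and keys are illustrative assumptions, not the speaker's actual schema.
import json
from pathlib import Path

CONFIG_PATH = Path("strategy_config.json")

DEFAULT_CONFIG = {
    "symbol": "AAPL",
    "fast_window": 10,
    "slow_window": 100,
    "max_position_size": 0.1,  # risk-management limit as a fraction of the portfolio
    "stop_loss_pct": 0.02,
}


def load_config(path: Path = CONFIG_PATH) -> dict:
    """Read strategy parameters from JSON, falling back to defaults if the file is absent."""
    if path.exists():
        return {**DEFAULT_CONFIG, **json.loads(path.read_text())}
    path.write_text(json.dumps(DEFAULT_CONFIG, indent=2))  # create a template on first run
    return dict(DEFAULT_CONFIG)


config = load_config()
print(f"Running strategy on {config['symbol']} "
      f"with windows {config['fast_window']}/{config['slow_window']}")
```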
All right, let's get back to where we were, and let me wrap things up. It's a new world altogether, a great world altogether, especially in the field of data analytics and AI. It has been democratized for a lot of people who were a bit skeptical or concerned about using these tools, because the LLMs that are out there now make it so much more convenient and easier for a business user and a technical user to collaborate, to prototype, and to build applications in a very natural way, the way human beings do it. And for software engineers it is an amazing capability, which typically improves their productivity in generating new innovations and new technologies. So I look forward to the next evolution of AI that is out there. We have to adapt to it, we have to adopt it, and this is a very good direction; I'm fairly confident I'm going to take a lot of benefit from it. So go to this QuantLab custom GPT that I have built. It's free to use; if you have a ChatGPT subscription with GPT-4o access, you can easily reach it. And if you need any assistance or guidance, please reach out to me on LinkedIn and I will be able to help you. Thank you so very much, and all the best with your journeys.

Arsalan Sheikh

Treasury & Capital Markets @ ex-Oracle

Arsalan Sheikh's LinkedIn account


