Final words
Get the project source code below, and follow along with the lesson material.
To set up the project on your local machine, please follow the directions provided in the README.md file. If you run into any issues running the project source code, feel free to reach out to the author in the course's Discord channel.
[00:00 - 01:21] Congratulations! You have finished this course and learned how to build an AI-driven application. We've learned how to set up a frontend with React, how to create a backend with FastAPI and LangChain, how to leverage the Server-Sent Events protocol to stream events over the network, and how to use a vector database to manipulate semantic vectors. We've also built the most common and powerful use cases, such as the completion use case, the chat use case, and the Retrieval-Augmented Generation use case. Before we end this course, I wanted to briefly mention the subject of deployment. As you can see, this demo app was deployed: the frontend was deployed on Vercel, and the backend was containerized using Docker and deployed on Fly.io. If you want more detail, go to the README of the code, where the deployment is documented in detail. Lastly, there is a dedicated Discord channel for this course. Feel free to join it and share anything you want with us: perhaps an app you managed to build thanks to this course, what you learned, a bug or a mistake you want to report, or anything else you want to say. Thanks a lot. I hope this course will let you build the product of your dreams. Bye!
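As a rough illustration of the containerization step mentioned above, here is a minimal Dockerfile sketch for a FastAPI backend. The file names and the app module path (`app.main:app`) are assumptions for illustration, not the course's exact setup — check the project README for the real deployment configuration.

```dockerfile
# Minimal sketch of a FastAPI container image.
# The module path "app.main:app" and requirements.txt location are assumed.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

# Fly.io forwards external traffic to the port the app listens on
EXPOSE 8080
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8080"]
```

With a file like this in place, a Fly.io deployment typically amounts to running `fly launch` once to create the app and then `fly deploy` for each release.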