



Automation
Prompt engineering
Context
The rise in popularity of LLMs among the general public has prompted the technical team to design innovative AI-based features.
Problem
LLM APIs generate their responses in real time, token by token. Exposing this streaming behaviour in the product required building a custom microservice from scratch.
Solution
- Deployment of a microservice that handles real-time data streams with Socket.io
- Integration of OpenAI's ChatGPT API, with responses streamed to the client as they are generated (see the sketch below)
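For illustration, a minimal sketch of how such a relay could be wired together, assuming a Node.js microservice written in TypeScript with the socket.io and openai (v4) packages. The port, model name, and event names (prompt, token, done, stream_error) are placeholders, not the project's actual values.

```typescript
// stream-relay.ts — relay OpenAI chat completion chunks to browser clients over Socket.io
import { Server } from "socket.io";
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const io = new Server(3000, { cors: { origin: "*" } }); // port and CORS policy are placeholders

io.on("connection", (socket) => {
  // Hypothetical event names: the client sends "prompt", then receives "token" and "done".
  socket.on("prompt", async (prompt: string) => {
    try {
      const stream = await openai.chat.completions.create({
        model: "gpt-4o-mini", // placeholder model name
        messages: [{ role: "user", content: prompt }],
        stream: true,
      });

      // Forward each delta to the client as soon as it arrives.
      for await (const chunk of stream) {
        const delta = chunk.choices[0]?.delta?.content ?? "";
        if (delta) socket.emit("token", delta);
      }
      socket.emit("done");
    } catch (err) {
      socket.emit("stream_error", (err as Error).message);
    }
  });
});
```

The point of the Socket.io layer is that the backend can push each chunk the moment it arrives, so the interface renders the answer progressively instead of waiting for the full completion.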
Impact
- The implementation made it possible to solve problems such as file conversion and text synthesis.


