Blogs


Master the Art of Data Streamlining with Pub/Sub Pipelines
Data pipeline
Creating a pipeline that sends events off for background processing, with scalability in mind, is a key requirement for building applications today. To handle heavy or time-consuming jobs, it is generally better to separate them from your core application, and one way to achieve this is a pub/sub architecture. The idea is as follows: maintain a queue of messages; a publisher pushes messages to t...
Rahul Kumar
2 min read
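The teaser above describes the core pattern: a queue of messages, a publisher that pushes onto it, and background consumers that process off it. A minimal in-process sketch of that idea, using Python's standard `queue` and `threading` modules as stand-ins for a real message broker (the `run_pipeline` function and the uppercase "job" are hypothetical illustrations, not code from the post), might look like:

```python
import queue
import threading


def run_pipeline(events):
    """Illustrative pub/sub sketch: publisher and subscriber are decoupled
    by a message queue, so heavy work happens off the main code path."""
    q = queue.Queue()   # the message queue
    processed = []

    def publisher():
        # The publisher pushes messages onto the queue.
        for event in events:
            q.put(event)
        q.put(None)     # sentinel: no more messages

    def subscriber():
        # The subscriber pulls messages off the queue and processes them
        # in the background, independently of the publisher.
        while True:
            msg = q.get()
            if msg is None:
                break
            processed.append(msg.upper())  # stand-in for a heavy job

    t_pub = threading.Thread(target=publisher)
    t_sub = threading.Thread(target=subscriber)
    t_pub.start()
    t_sub.start()
    t_pub.join()
    t_sub.join()
    return processed
```

In a production pipeline the in-memory queue would be replaced by a managed broker (e.g. Google Cloud Pub/Sub, RabbitMQ, or Kafka), but the publisher/subscriber split is the same.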


Unleashing Efficiency: Batch Processing with Large Language Models
We at Newtuple build applications that leverage LLMs (large language models) such as OpenAI's models, Gemini, etc., which has become an exciting...
Rahul Kumar
4 min read