
It’s not just about passing prompts. In a production system like Ramp’s, the team had to build a custom ETL pipeline to process data from their API endpoints and host a separate database just to serve structured transaction data into the LLM’s context window effectively.
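
A rough sketch of that pattern in Python (the endpoint URL, response shape, and table schema here are hypothetical stand-ins for illustration, not Ramp's actual API):

    import sqlite3
    import requests

    def etl_transactions(db_path=":memory:"):
        """Fetch raw transactions and load a flattened copy into SQLite
        so the agent can run narrow SQL instead of reading raw JSON."""
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS transactions ("
            "id TEXT PRIMARY KEY, merchant TEXT, amount REAL, date TEXT)"
        )
        # Hypothetical endpoint and response shape, for illustration only.
        resp = requests.get("https://api.example.com/transactions")
        resp.raise_for_status()
        rows = [(t["id"], t["merchant"], t["amount"], t["date"])
                for t in resp.json()["data"]]
        conn.executemany(
            "INSERT OR REPLACE INTO transactions VALUES (?, ?, ?, ?)", rows)
        conn.commit()
        return conn

The agent then asks pointed questions ("total spend per merchant last month") via SQL, rather than paging raw records through the context window.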

We’ve seen similar pre-processing strategies across many efficient LLM-integrated APIs: GraphQL queries that shape data precisely, SQL transformations that make results LLM-friendly, or LLM-assisted data shaping like Exa does for search.
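
For the SQL-transformation flavor, something like this (building on the hypothetical transactions table above) keeps what the model sees compact and self-describing:

    import sqlite3

    def create_llm_view(conn: sqlite3.Connection) -> None:
        """Pre-aggregate into a compact, labeled view: a handful of
        self-describing rows instead of thousands of raw records."""
        conn.execute(
            "CREATE VIEW IF NOT EXISTS spend_by_merchant AS "
            "SELECT merchant, COUNT(*) AS txn_count, "
            "ROUND(SUM(amount), 2) AS total_spend "
            "FROM transactions GROUP BY merchant "
            "ORDER BY total_spend DESC"
        )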

https://engineering.ramp.com/ramp-mcp

PS: When building agents, prompt and context management quickly become the real bottleneck. You end up juggling dynamic prompts, tool descriptions, and task-specific data, all without blowing the context window or inducing hallucinations. MCP servers help here by acting as a "plug-and-play" prompt loader, fetching task-relevant prompts or tool wrappers just-in-time. The result is more efficient tool selection, less prompt bloat, and better overall reasoning in agent workflows.
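
For illustration, here's what a minimal prompt-loading server might look like with the MCP Python SDK's FastMCP interface (the task names and prompt text are made up):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("prompt-loader")

    # Task-specific templates kept out of the agent's base prompt;
    # the contents here are purely illustrative.
    PROMPTS = {
        "categorize": "Categorize each transaction into one of: {categories}.",
        "reconcile": "Match receipts to transactions and flag discrepancies.",
    }

    @mcp.prompt()
    def task_prompt(task: str) -> str:
        """Serve the prompt for a given task just-in-time."""
        return PROMPTS.get(task, "Unknown task.")

    if __name__ == "__main__":
        mcp.run()

The client only pulls "categorize" or "reconcile" when that task actually comes up, so the base context stays small.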



