The effects of prompt caching on Agentic coding

Prompt caching is a feature that Anthropic first offered on their API in 2024. It adds a cache for the tokens sent to the model. Why it matters: without prompt caching, every token in and out of the API must be processed and paid for in full. This is bad for your wallet, bad for the LLM hosting provider's bottom line, and bad for the environment. It matters especially for agentic coding, where there is a high volume of tokens in/out and, crucially, a lot of token reuse, which makes it a perfect use case for prompt caching. ...
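As a rough illustration (not taken from the post itself), the reuse works by marking a large, stable prefix such as the system prompt with `cache_control`, so that repeated requests sharing that prefix read it from the cache instead of reprocessing it. A minimal sketch using Anthropic's Python SDK; the context string and user message are placeholder assumptions:

```python
# Minimal sketch of Anthropic prompt caching (illustrative, not from the post).
# A large, stable system prompt is marked with cache_control so subsequent
# requests sharing the same prefix hit the cache rather than being
# reprocessed (and billed) in full.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# In agentic coding this prefix is typically the system prompt plus tool
# definitions and repo context: large, and identical across many turns.
LARGE_STABLE_CONTEXT = "You are a coding agent... (tool definitions, repo docs, etc.)"

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": LARGE_STABLE_CONTEXT,
            "cache_control": {"type": "ephemeral"},  # cache everything up to here
        }
    ],
    messages=[{"role": "user", "content": "Refactor utils.py to remove duplication."}],
)

# usage reports cache_creation_input_tokens and cache_read_input_tokens,
# which is where the cost savings show up on repeated calls.
print(response.usage)
```

The ephemeral cache is short-lived (entries expire after a few minutes of inactivity), so the savings come from the rapid back-to-back requests typical of an agentic loop.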

March 20, 2025 · Sam McLeod

Agentic Coding - Live Demo / Brownbag

Apologies for the video quality; Google Meet/Hangouts records in very low resolution and bitrate. Links mentioned in the video:

- Cline
- Roo Code (Cline fork with some experimental features)
- MCP: https://modelcontextprotocol.io/introduction
- The package-version MCP server I created: https://github.com/sammcj/mcp-package-version
- https://smithery.ai (index of MCP servers)
- https://mcp.so (index of MCP servers)
- https://glama.ai/mcp/servers (index of MCP servers)

February 7, 2025 · Sam McLeod

Fun with Makefiles - Dynamic Menu Generation

November 14, 2023 · 2 min · 257 words · Sam McLeod