That's a good question. I'd recommend MCP for the bulk of the 'chatty' soft data to keep the database clean. However, you should selectively ingest high-value data into ClickHouse for vector search.
For example, you wouldn't ingest every 'good morning' message. But once an incident is resolved, you could ETL the relevant threads (filtering out noise) and the resulting RCA into ClickHouse as a vectorized document. That way, the copilot can recall the solution six months later without depending on Slack.
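A minimal sketch of the filtering step in that ETL, assuming each Slack message is a plain dict with `user` and `text` keys (the noise heuristics, field names, and threshold here are all hypothetical; embedding and the ClickHouse insert would follow downstream):

```python
# Hypothetical noise filter for a resolved incident thread, run before
# embedding the surviving messages and inserting them into ClickHouse.

NOISE_PREFIXES = ("good morning", "thanks", "+1", "lgtm", ":")  # assumed heuristics

def is_noise(text: str) -> bool:
    """Treat very short or greeting/ack-style messages as noise."""
    t = text.strip().lower()
    return len(t) < 15 or any(t.startswith(p) for p in NOISE_PREFIXES)

def distill_thread(messages: list[dict]) -> list[dict]:
    """Keep only the substantive messages worth vectorizing."""
    return [m for m in messages if not is_noise(m["text"])]

thread = [
    {"user": "a", "text": "good morning team"},
    {"user": "b", "text": "Root cause: connection pool exhausted after the "
                          "v2.3 deploy; fix was raising max_pool_size."},
    {"user": "a", "text": "+1"},
]
kept = distill_thread(thread)
print(len(kept))  # only the RCA-bearing message survives
```

The distilled messages would then be concatenated into one document, embedded, and written to a ClickHouse table with a vector column for similarity search.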
The interesting part is that only one of them is software-only. I get that they're economists, but I think it's also telling that while AI is the talk of the town in Silicon Valley, long term it's just one part of the future, and a minor one if I may.
We're a service that helps brands navigate the new world of AI agents. We're currently focused on helping them increase visibility in AI search, but we plan to go beyond that.
> Unlike most prompt injections, the researchers said Shadow Leak executed on OpenAI’s cloud infrastructure and leaked data directly from there. This makes it invisible to standard cyber defenses, they wrote.