We Gave Our Browser Agent a 3MB Data Warehouse
37 points by shardullavekar 5 days ago | 5 comments
  • kbdiaz 5 days ago
    Yeah, this is the natural next step. We're currently combining LLMs and compute -- mostly in the form of giving agents tools, then terminal access, and most recently sandboxes. The most logical next step is to give them specialized compute engines and frameworks for their tasks.

    I've been building SQL agents recently, and nothing is better than just giving them access to Trino.
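
    To make that concrete: "access to Trino" can be a single run_sql tool rather than a pile of per-task libraries. A rough sketch below, with the tool schema in the common function-calling shape; runTrinoQuery is a hypothetical stub, not any particular client's API.

      // A single SQL tool exposed to the agent; the warehouse does the heavy lifting.
      type Row = Record<string, unknown>;

      // Hypothetical stand-in: swap in whatever Trino client you actually use here.
      async function runTrinoQuery(sql: string): Promise<Row[]> {
        throw new Error(`Trino client not wired up yet for: ${sql}`);
      }

      const sqlTool = {
        type: "function" as const,
        function: {
          name: "run_sql",
          description:
            "Run a read-only SQL query against the Trino warehouse and return rows as JSON.",
          parameters: {
            type: "object",
            properties: {
              sql: { type: "string", description: "ANSI SQL SELECT statement" },
            },
            required: ["sql"],
          },
        },
      };

      // Handle a tool call coming back from the model.
      async function handleToolCall(name: string, args: { sql: string }): Promise<string> {
        if (name !== "run_sql") throw new Error(`unknown tool: ${name}`);
        const rows = await runTrinoQuery(args.sql);
        // Truncate so the result fits back in the model's context window.
        return JSON.stringify(rows.slice(0, 200));
      }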

  • charcircuit 5 days ago
    >You import heavy libraries just to do fuzzy string matching.

    So instead of a heavy library for string matching, you use a heavy library for a whole SQL engine that includes fuzzy string matching?

    • odo1242 4 days ago
      Instead of having the AI write JavaScript code that does fuzzy string matching, they have it write SQL.
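
      For illustration (not the article's actual code), here's roughly what "fuzzy matching as SQL" looks like with DuckDB's built-in jaro_winkler_similarity, via the duckdb npm package:

        import duckdb from "duckdb";

        // In-memory DuckDB; string similarity is a built-in SQL function,
        // so no separate fuzzy-matching library is needed.
        const db = new duckdb.Database(":memory:");

        db.all(
          `
          WITH crm(name)     AS (VALUES ('Acme Inc.'), ('Globex Corp')),
               billing(name) AS (VALUES ('ACME, Inc'), ('Globex Corporation'))
          SELECT crm.name     AS crm_name,
                 billing.name AS billing_name,
                 jaro_winkler_similarity(lower(crm.name), lower(billing.name)) AS score
          FROM crm CROSS JOIN billing
          WHERE jaro_winkler_similarity(lower(crm.name), lower(billing.name)) > 0.8
          ORDER BY score DESC
          `,
          (err, rows) => {
            if (err) throw err;
            console.table(rows); // pairs scoring above the 0.8 threshold
          }
        );
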
  • smithclay 4 days ago
    This is an emerging pattern that’s surprisingly powerful: thick clients that embed wasm query engines (pglite, duckdb) and do serious analytics (with or without AI agents writing the queries).

    Below are two examples using duckdb under the hood for similar purposes. Like the author, I'm excited about this type of architecture making semi-advanced analytics more attainable if you're not a data engineer.

    - https://pandas-ai.com/
    - https://marimo.io/
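
    For anyone curious about the wasm side: getting DuckDB running in the page is only a few lines. A rough sketch with @duckdb/duckdb-wasm (the Parquet URL is a placeholder and its host has to allow CORS):

      import * as duckdb from "@duckdb/duckdb-wasm";

      async function main() {
        // Fetch a prebuilt WASM bundle from jsDelivr and run the engine in a web worker.
        const bundle = await duckdb.selectBundle(duckdb.getJsDelivrBundles());
        const workerUrl = URL.createObjectURL(
          new Blob([`importScripts("${bundle.mainWorker!}");`], { type: "text/javascript" })
        );
        const db = new duckdb.AsyncDuckDB(new duckdb.ConsoleLogger(), new Worker(workerUrl));
        await db.instantiate(bundle.mainModule, bundle.pthreadWorker);
        URL.revokeObjectURL(workerUrl);

        // Query a remote Parquet file entirely client-side; no backend in the path.
        const conn = await db.connect();
        const result = await conn.query(`
          SELECT category, count(*) AS n
          FROM 'https://example.com/events.parquet'  -- placeholder URL
          GROUP BY category
          ORDER BY n DESC
          LIMIT 10
        `);
        console.table(result.toArray().map((r) => r.toJSON()));
        await conn.close();
      }

      main();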

  • kristianp 3 days ago
    There's no explanation of what the 3MB in the title refers to.