A simpler answer might be llamafile if you’re using Mac or Linux.
If you’re on Windows you’re limited to smaller LLMs without some extra work (Windows caps executable size at 4 GB, so bigger models mean loading the weights separately). In my experience the smaller LLMs are still pretty good as chatbots, so they might translate well.
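For what it's worth, once a llamafile is up it serves an OpenAI-compatible API, so you can script against it. Here's a minimal Python sketch assuming the default port 8080; the model name is just a placeholder, llamafile doesn't care what you put there:

```python
# Chat with a llamafile that's already running locally.
# Assumes the default OpenAI-compatible endpoint on port 8080.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps({
        "model": "local",  # placeholder; the server uses whatever model it loaded
        "messages": [{"role": "user", "content": "Say hi in one sentence."}],
    }).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["choices"][0]["message"]["content"])
```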
I love duckDB, my usual workflow is to convert whatever data I’m working with into a directory of parquet files.
Then duckdb treats the directory just like a database that you can build indexes on, and since they’re parquet files they’re hella small and have static typing. It was pretty fast and efficient before, and duckdb has really sped up my data wrangling and analysis a ton.
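Roughly, in Python that workflow looks like this (the path and column names are made up for illustration):

```python
# Minimal sketch of the directory-of-parquet-files workflow described above.
import duckdb

con = duckdb.connect()  # in-memory database is fine for ad-hoc analysis

# Point DuckDB at the whole directory; it picks up the parquet schema (typed
# columns) and only scans the columns/row groups each query actually needs.
con.sql("CREATE VIEW events AS SELECT * FROM read_parquet('data/*.parquet')")

# Query the directory like any other table.
con.sql("""
    SELECT user_id, count(*) AS n
    FROM events
    GROUP BY user_id
    ORDER BY n DESC
    LIMIT 10
""").show()
```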
Should include “has duckplyr”, which has been badass in the few weeks I’ve been using it.
Llamafile is a great way to use an LLM locally. Inference is incredibly fast on my ARM MacBook and RTX 4060 Ti; it’s okay on my Intel laptop running Ubuntu.