Show HN: Run LLaMa2 in the Browser with Ggml.js https://ift.tt/XhlNZpT

You can now build serverless AI inference web applications with ggml.js's LM backends. https://ift.tt/IQM9n5S August 13, 2023 at 01:22AM
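As a rough illustration of what "serverless" inference means here, the sketch below shows the general pattern of loading a quantized model into a WASM runtime and generating text entirely client-side. The module name, `loadModel`, `generate`, and the option names are assumptions for illustration only, not ggml.js's confirmed API; consult the project's documentation for real usage.

```ts
// Hypothetical sketch of in-browser LLM inference with a WASM-backed library.
// The module path "ggml.js" and the loadModel/generate names are assumptions,
// not the library's confirmed API.

// A quantized model file served as a static asset (no server-side inference).
const MODEL_URL = "/models/llama2-7b-q4_0.bin"; // hypothetical path

async function main(): Promise<void> {
  // Dynamically import the library so the page loads before the WASM is fetched.
  const ggml = await import("ggml.js"); // assumed module name

  // Load the model weights into the WASM runtime running entirely in the browser.
  const model = await ggml.loadModel(MODEL_URL); // assumed function name

  // Run text generation locally; no request leaves the user's machine.
  const output = await model.generate(
    "Explain serverless inference in one sentence.",
    { maxTokens: 64 } // assumed option name
  );

  document.body.textContent = output;
}

main().catch(console.error);
```

Because the model and runtime are plain static assets, the whole application can be hosted on a CDN or static file server with no inference backend.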
