
Feature Request: Support Airllm #10202

Open
4 tasks done
kbocock-krg opened this issue Nov 7, 2024 · 0 comments
Labels: enhancement (New feature or request)

Comments

@kbocock-krg

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

Please add support for AirLLM, which enables running large models on 4 GB and 8 GB GPUs: https://github.com/lyogavin/airllm

Motivation

A major reduction in the barrier to entry for running large LLMs on consumer hardware.

Possible Implementation

Expose the feature through the llama.cpp server and the Python library.
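For context, AirLLM's core technique (per its README) is layer-by-layer inference: only one layer's weights are resident in memory at a time, which is how a 70B model can run on a 4 GB GPU. Below is a minimal, framework-free sketch of that idea using a toy NumPy MLP and a hypothetical one-file-per-layer checkpoint layout; it is an illustration of the technique, not AirLLM's actual API or llama.cpp code.

```python
import os
import tempfile
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a sharded checkpoint: one .npy weight file per layer.
layer_dir = tempfile.mkdtemp()
n_layers, dim = 4, 8
for i in range(n_layers):
    np.save(os.path.join(layer_dir, f"layer_{i}.npy"),
            rng.standard_normal((dim, dim)))

def run_layered(x: np.ndarray) -> np.ndarray:
    """Apply each layer, loading only one layer's weights at a time."""
    for i in range(n_layers):
        # Only this layer's weights are in memory during this step.
        w = np.load(os.path.join(layer_dir, f"layer_{i}.npy"))
        x = np.tanh(x @ w)
        del w  # weights are freed before the next layer is loaded
    return x

out = run_layered(rng.standard_normal(dim))
print(out.shape)  # (8,)
```

The trade-off is obvious but worth noting for the integration: peak memory drops from all-layers to one-layer, at the cost of reloading weights from disk on every forward pass.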

@kbocock-krg kbocock-krg added the enhancement New feature or request label Nov 7, 2024