This is an automated archive made by the Lemmit Bot.

The original was posted on /r/stablediffusion by /u/jmellin on 2024-10-13 13:59:40+00:00.


Hey everyone!

I’m excited to share the launch of CogVideoX-LoRAs, a dedicated GitHub repository that serves as a central hub for all LoRA (Low-Rank Adaptation) models created for CogVideoX.

With the rise of community fine-tuned weights for CogVideoX, I quickly realized it would be hard to keep track of all the new models being created, and the need for a unified place to collect and share LoRA models tailored for CogVideoX became evident. With the growing demand for customized video generation, I wanted to create a space where users, developers, and researchers can easily access, contribute to, and collaborate on various LoRA models.

What you can find

  • Comprehensive Collection (in progress): A growing list of all available LoRAs with direct links to their Hugging Face repositories.
  • Community Contributions: An open invitation for everyone to contribute their models or improvements, fostering a collaborative environment.
  • Easy Navigation: Clear organization and categorization of LoRA models to make discovery and usage straightforward.

What’s to come

  • Usage Examples: Code snippets and documentation to help you get started quickly with the models (a rough sketch of what this could look like follows after this list).
  • Training Examples: Code snippets and documentation to assist you in training your own weights based on your hardware and environment.
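As a taste of what the usage examples could look like, here is a minimal sketch of loading a community LoRA into CogVideoX with the diffusers library. The LoRA repo id and prompt are placeholders, and the exact settings (steps, guidance, frame count) will depend on the specific LoRA:

```python
# Minimal sketch: generate a clip with CogVideoX plus a community LoRA.
# "your-username/your-cogvideox-lora" is a placeholder repo id.
import torch
from diffusers import CogVideoXPipeline
from diffusers.utils import export_to_video

# Load the base CogVideoX-5b pipeline in bfloat16 and move it to the GPU.
pipe = CogVideoXPipeline.from_pretrained(
    "THUDM/CogVideoX-5b", torch_dtype=torch.bfloat16
)
pipe.to("cuda")

# Load a LoRA from the Hugging Face Hub (placeholder repo id).
pipe.load_lora_weights("your-username/your-cogvideox-lora")

# Run the pipeline; .frames[0] is the list of frames for the first video.
video = pipe(
    prompt="A golden retriever running through a sunlit meadow",
    num_inference_steps=50,
    guidance_scale=6.0,
    num_frames=49,
).frames[0]

export_to_video(video, "output.mp4", fps=8)
```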

Check it out!

You can explore the repository here: CogVideoX-LoRAs GitHub Repo

The list of currently available LoRAs can be found here: LoRA Models

I welcome any feedback, suggestions, or contributions! If you have a LoRA model you’d like to add or any ideas for improvement, feel free to open a pull request or leave a comment. Let’s build a robust collection together!

Thanks for your interest, and I look forward to seeing the amazing LoRAs you all create! 🚀

Update:

Added a Python script to simplify contributing to the list and keep the table structured without the need for manual edits. The script asks for two inputs:

  • The Hugging Face link to the repository
  • A short description of the LoRA, no more than 250 characters

Fork the repo, run the Python script, and create a PR. Done! A rough sketch of what such a script could look like is included below.
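For illustration only, here is a minimal sketch of what a contribution script like this could look like. The target file name (README.md), the table layout, and the script name are assumptions for the sake of the example, not necessarily how the repo actually does it:

```python
# Hypothetical sketch of the contribution script: prompt for a Hugging Face
# link and a short description, then append a row to a Markdown table.
# The file name (README.md) and table layout are assumptions.
from pathlib import Path

MAX_DESCRIPTION_LENGTH = 250
TABLE_FILE = Path("README.md")  # assumed location of the LoRA table


def main() -> None:
    hf_link = input("Hugging Face link to the repository: ").strip()
    if not hf_link.startswith("https://huggingface.co/"):
        raise SystemExit("Please provide a full Hugging Face URL.")

    description = input("Short description of the LoRA (max 250 chars): ").strip()
    if len(description) > MAX_DESCRIPTION_LENGTH:
        raise SystemExit(
            f"Description is {len(description)} characters; "
            f"the limit is {MAX_DESCRIPTION_LENGTH}."
        )

    # Derive a display name from the repo id, e.g. "user/my-lora" -> "my-lora".
    name = hf_link.rstrip("/").split("/")[-1]

    # Escape pipes so the description cannot break the Markdown table.
    description = description.replace("|", "\\|")

    # Append a new table row: | [name](link) | description |
    row = f"| [{name}]({hf_link}) | {description} |\n"
    with TABLE_FILE.open("a", encoding="utf-8") as f:
        f.write(row)

    print(f"Added {name} to {TABLE_FILE}. Commit the change and open a PR.")


if __name__ == "__main__":
    main()
```

With something like this in place, contributing is just running the script from the repo root, committing the updated table, and opening the PR.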