Couple of technical questions (LoRAs, TIs and AWS)

    • #13699
      Marko Zegarac
      Participant

        Hey Andrew,

        I don’t want to spam the forum with my questions, so I will put both of my questions here.

        1. Textual Inversion vs. LoRA: which one is better? By ‘better’, I mean: which produces better results? Assuming that:
        • Usage is on Stable Diffusion 1.5 (or its derivatives)
        • Computational resources are not a problem for inference or training.
        • File size does not matter, only the results.

        With all that equal, which of the two is a superior technique and why?

        2. Amazon Web Services: which EC2 instance is best for running Stable Diffusion remotely? Ideally, something that could also run Stable Diffusion 3 (once it comes out). Or would you avoid EC2 entirely? If so, which service from the AWS ecosystem would you recommend instead?

        Thank you!

        Marko

      • #13701
        Andrew
        Keymaster

          Hi!

          1. LoRA is better than TI, in the sense that it can change the model more. Textual Inversion only learns a new token embedding for the text encoder, while LoRA adds trainable low-rank updates to the model’s weight matrices, so it can alter the model’s behavior far beyond what a single new token can express.
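
          To make the difference concrete, here is a minimal back-of-the-envelope sketch comparing trainable parameter counts. The dimensions are assumptions chosen to be SD 1.5-like (768-dim text embeddings, rank-4 LoRA, 32 adapted attention projections), not exact model statistics:

          ```python
          # Rough parameter-count comparison: Textual Inversion vs. LoRA.
          # All sizes below are illustrative assumptions, not exact SD 1.5 stats.

          EMBED_DIM = 768       # SD 1.5 text-encoder embedding size
          NUM_TI_TOKENS = 1     # TI typically learns one (or a few) token vectors
          # Textual Inversion trains only the new token embedding(s):
          ti_params = NUM_TI_TOKENS * EMBED_DIM

          HIDDEN = 768          # assumed size of one attention projection matrix
          RANK = 4              # a common LoRA rank
          NUM_MATRICES = 32     # assumed number of adapted projections
          # Each LoRA pair augments a frozen W (HIDDEN x HIDDEN) as W + B @ A,
          # with A of shape (RANK x HIDDEN) and B of shape (HIDDEN x RANK):
          lora_params = NUM_MATRICES * (RANK * HIDDEN + HIDDEN * RANK)

          print(ti_params)      # 768
          print(lora_params)    # 196608
          ```

          Even at a small rank, LoRA trains orders of magnitude more parameters spread across the network’s weights, which is why it can shift the model’s output much more than a single learned token.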

          2. Typically, instances with a T4 or V100 GPU. You can stick with a single GPU for inference. AWS is not the most cost-effective option, though. Consider Google Colab if you are OK with the interactive interface; they offer the same GPUs at a much lower rate.
