Forums

Any plan to support ollama and deepskin?

The subject line says it all. It seems that these technologies are more affordable for the 'little guy'.

I'm guessing you mean DeepSeek? We're unlikely to offer plans supporting LLM inference running directly on PythonAnywhere -- it's very CPU-intensive, or more often GPU-intensive, so unfortunately it would be expensive to provide and we'd have to charge commensurately. You should be able to call the DeepSeek API from PythonAnywhere, though, and I believe their prices are much lower than (say) OpenAI's or Anthropic's.
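
For anyone who finds this thread later, here's a minimal sketch of what that might look like. It assumes DeepSeek's OpenAI-compatible endpoint and the openai Python package; the model name and environment variable are illustrative, so check DeepSeek's own docs for the current details:

    # Minimal sketch: calling the DeepSeek API via its OpenAI-compatible
    # endpoint using the openai package. Model name and env var name are
    # illustrative assumptions -- check DeepSeek's docs for current values.
    import os
    from openai import OpenAI

    client = OpenAI(
        api_key=os.environ["DEEPSEEK_API_KEY"],  # your own API key
        base_url="https://api.deepseek.com",     # OpenAI-compatible endpoint
    )

    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello from PythonAnywhere!"}],
    )
    print(response.choices[0].message.content)

You'd just need to pip install openai into your virtualenv and set the API key as an environment variable (or in your web app's WSGI file).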

Thanks for the reply. And obviously you guessed right -- I meant DeepSeek.