
FR: Integrated Local Download of Ollama Models #33

@LeonelRFF

Description

Is your feature request related to a problem? Please describe.

Hello. The drawback I find is that to use models locally, you need another tool such as Termux: from there you can install Ollama, and only then can you use it in Swift-Chat.

Describe the solution you'd like

Would it be possible to download models from within the app and run them locally? I mean, in the Ollama section, you could type the name of a model to download it directly from there.
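
As a rough illustration, here is a minimal sketch of what such a download flow might look like, assuming the app can reach a local Ollama server on its default port (11434). Ollama's REST API exposes a `POST /api/pull` endpoint that accepts a model name and streams download progress back as JSON lines; the function name, host, and progress callback below are hypothetical wiring for this sketch, not anything the app currently provides.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical helper: ask a local Ollama server to pull a model by name.
// Ollama's REST API exposes POST /api/pull, which streams progress back
// as one JSON object per line (e.g. {"status":"pulling manifest"}).
// The host/port and callback wiring are assumptions for this sketch.
fun pullModel(modelName: String, onProgress: (String) -> Unit) {
    val url = URL("http://127.0.0.1:11434/api/pull")
    val conn = (url.openConnection() as HttpURLConnection).apply {
        requestMethod = "POST"
        doOutput = true
        setRequestProperty("Content-Type", "application/json")
    }
    // Request body: the model to download, exactly as typed by the user.
    conn.outputStream.use { it.write("""{"name": "$modelName"}""".toByteArray()) }
    // Read the streamed status lines until the download finishes.
    conn.inputStream.bufferedReader().useLines { lines ->
        lines.forEach(onProgress)
    }
    conn.disconnect()
}

fun main() {
    // Example: download the model name entered in the Ollama section.
    pullModel("llama3.2") { status -> println(status) }
}
```

In the app, the progress callback could drive a download bar in the Ollama section instead of printing to the console.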

Describe alternatives you've considered

Something similar to what I'm referring to is the project at github.com/sunshine0523/OllamaServer, where Termux is not needed; you can do all of this directly from the application.

Additional context

No response
