Hi everyone 👋🏻
First of all, thanks for making this nice Ollama client. I love the clean interface!
Since it is currently not possible to point the app at an Ollama server other than localhost (and my Ollama server runs on another host), I set up an SSH tunnel (something like `ssh -N -L 11434:localhost:11434 user@ollama-host`) that forwards port 11434 on my localhost to the same port on the Ollama server.
It kind of works, but only for a short while. Selecting the model is always fine, but the chat stops with a...
```
Response could not be decoded because of error:
The data couldn't be read because it isn't in the correct format.
```
... after a few chunks of the answer have already been displayed.
So I downloaded the Xcode project and set some breakpoints.
I was able to catch the error, and this is the problem:
```
Printing description of result.failure.responseSerializationFailed.reason.decodingFailed.error:
▿ DecodingError
  ▿ dataCorrupted : Context
    - codingPath : 0 elements
    - debugDescription : "The given data was not valid JSON."
    ▿ underlyingError : Optional<Error>
      - some : Error Domain=NSCocoaErrorDomain Code=3840 "Unexpected character '{' after top-level value around line 2, column 1." UserInfo={NSJSONSerializationErrorIndex=103, NSDebugDescription=Unexpected character '{' after top-level value around line 2, column 1.}
```
A bit strange, since an SSH tunnel should not change the bytes of an HTTP response. My best guess is that it changes how those bytes are chunked: as far as I know, Ollama streams its answers as newline-delimited JSON (one object per line), and while each object may arrive as its own read on localhost, the tunnel can coalesce several objects into a single read. A decoder that treats each received chunk as one complete JSON document would then fail with exactly this "Unexpected character '{' after top-level value" error.
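If that guess is right, buffering the incoming bytes and decoding only complete, newline-terminated lines should make the problem go away. Here is a minimal sketch of what I mean; `ChatChunk`, `NDJSONBuffer`, and all other names are my own illustration, not taken from this project's code:

```swift
import Foundation

/// One streamed chunk of an Ollama chat response (illustrative subset of the fields).
struct ChatChunk: Decodable {
    struct Message: Decodable {
        let role: String
        let content: String
    }
    let message: Message?
    let done: Bool
}

/// Accumulates raw bytes and yields every complete NDJSON line as a decoded chunk.
/// Bytes after the last newline stay buffered until the next call.
final class NDJSONBuffer {
    private var buffer = Data()
    private let decoder = JSONDecoder()

    func append(_ data: Data) throws -> [ChatChunk] {
        buffer.append(data)
        var chunks: [ChatChunk] = []
        // Split on '\n' (0x0A); anything after the last newline is an incomplete line.
        while let newline = buffer.firstIndex(of: 0x0A) {
            let line = buffer.subdata(in: buffer.startIndex..<newline)
            buffer.removeSubrange(buffer.startIndex...newline)
            guard !line.isEmpty else { continue }
            chunks.append(try decoder.decode(ChatChunk.self, from: line))
        }
        return chunks
    }
}
```

Feeding every received `Data` through `append(_:)` instead of decoding it directly would make the chunk boundaries irrelevant, whether the stream comes straight from localhost or through a tunnel.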
Anyhow, since you have refrained from making the Ollama server configurable so far (even though it should be an easy thing to add), could it be that you ran into similar problems yourself?
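Just to show how small that change could be, here is a sketch of a user-configurable server address backed by `UserDefaults`; the key, type, and default here are made up for illustration and not taken from your code:

```swift
import Foundation

/// Illustrative settings helper; all names here are invented for this sketch.
enum OllamaSettings {
    private static let hostKey = "ollamaHost"
    private static let defaultHost = "http://localhost:11434"

    /// Base URL of the Ollama server, falling back to the usual localhost default.
    static var baseURL: URL {
        let host = UserDefaults.standard.string(forKey: hostKey) ?? defaultHost
        return URL(string: host) ?? URL(string: defaultHost)!
    }

    /// Persists a user-entered address such as "http://192.168.1.20:11434".
    static func setHost(_ host: String) {
        UserDefaults.standard.set(host, forKey: hostKey)
    }
}
```

A text field in a settings panel writing through `setHost(_:)` would then be all that's needed on the UI side.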
I read here...
https://stephencowchau.medium.com/ollama-context-at-generate-api-output-what-are-those-numbers-b8cbff140d95
... that Ollama opens additional ports besides 11434.
I wonder whether this could play a role.
I don't think it should, though, since those ports shouldn't be relevant for the communication between the client and the server.
Any thoughts?
Thanks a bunch in advance and keep up the great work!
Best regards
Stephan