
Results 22 comments of Philipp Zagar

We'll handle this in the context of the "Smart Stack" feature, which should make it way easier to display only the currently important information on the watch :)
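To illustrate the idea: on watchOS, the Smart Stack promotes a widget when its timeline entry reports high relevance. Below is a minimal WidgetKit sketch of that mechanism; all names and copy are placeholders, not the app's actual widget.

```swift
import WidgetKit
import SwiftUI

// Hypothetical timeline entry that marks a reminder as currently important.
struct ReminderEntry: TimelineEntry {
    let date: Date
    let message: String

    // A high relevance score tells the system this entry matters right now,
    // so the Smart Stack can rotate the widget towards the top.
    var relevance: TimelineEntryRelevance? {
        TimelineEntryRelevance(score: 100)
    }
}

struct ReminderProvider: TimelineProvider {
    func placeholder(in context: Context) -> ReminderEntry {
        ReminderEntry(date: .now, message: "Measurement due")
    }

    func getSnapshot(in context: Context, completion: @escaping (ReminderEntry) -> Void) {
        completion(placeholder(in: context))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ReminderEntry>) -> Void) {
        let entry = ReminderEntry(date: .now, message: "Measurement due")
        completion(Timeline(entries: [entry], policy: .atEnd))
    }
}

// Rectangular accessory widget that can appear in the watch Smart Stack.
struct ReminderWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ReminderWidget", provider: ReminderProvider()) { entry in
            Text(entry.message)
        }
        .supportedFamilies([.accessoryRectangular])
    }
}
```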

First we need some notifications to begin with, as already discussed ;)
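For completeness, requesting permission and scheduling a basic local notification with the UserNotifications framework can look roughly like the following sketch; the identifier, copy, and timing are placeholders, not the app's actual notification logic.

```swift
import UserNotifications

// Hypothetical helper that schedules a single local reminder.
func scheduleMeasurementReminder() async throws {
    let center = UNUserNotificationCenter.current()

    // Ask the user for permission before scheduling anything.
    let granted = try await center.requestAuthorization(options: [.alert, .sound, .badge])
    guard granted else { return }

    let content = UNMutableNotificationContent()
    content.title = "Measurement due"
    content.body = "Please take your measurement now."

    // Fire once in an hour; a recurring schedule would use UNCalendarNotificationTrigger instead.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 3600, repeats: false)
    let request = UNNotificationRequest(identifier: "measurement-reminder", content: content, trigger: trigger)
    try await center.add(request)
}
```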

@cdillard Amazing work, let me know if I can be of any help!

Sadly, in its current state, CoreML is not optimized to run LLMs and therefore is way too slow for local LLM execution. Currently, SpeziLLM provides local inference functionality via llama.cpp,...

CoreML doesn't seem likely to be optimized for LLM inference in the near future; Apple is focusing on [MLX](https://github.com/ml-explore/mlx) for LLM execution 🚀 Therefore, we're closing the issue for now, might be...

SpeziLLM now allows pulling Hugging Face-hosted models for local execution, since we switched from llama.cpp to MLX in https://github.com/StanfordSpezi/SpeziLLM/pull/73 🚀
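For context, pulling a Hugging Face-hosted file onto the device essentially boils down to downloading it from the Hub's `resolve` route and caching it locally. The sketch below only illustrates that idea and is not the SpeziLLM implementation; the repository and file name are placeholders.

```swift
import Foundation

// Hypothetical downloader for a single Hugging Face-hosted model file.
func downloadModel(repo: String, file: String) async throws -> URL {
    // Hugging Face serves raw repository files under `resolve/<revision>/<path>`.
    guard let remote = URL(string: "https://huggingface.co/\(repo)/resolve/main/\(file)") else {
        throw URLError(.badURL)
    }

    let destination = FileManager.default
        .urls(for: .cachesDirectory, in: .userDomainMask)[0]
        .appendingPathComponent(file)

    // Reuse a previously downloaded copy if it exists.
    if FileManager.default.fileExists(atPath: destination.path) {
        return destination
    }

    let (temporaryURL, response) = try await URLSession.shared.download(from: remote)
    guard (response as? HTTPURLResponse)?.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
    try FileManager.default.moveItem(at: temporaryURL, to: destination)
    return destination
}
```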

Hey @czechboy0, thanks for the quick answer! Yep, your minimal example is correct! What I could envision is (simplified) generated code like: ```swift struct Object: Codable, Hashable, Sendable {...

> Pragmatically, the easiest way to remove the warnings is to pre-process the OpenAPI doc and remove the `oneOf`. You'll then still end up with a usable generated struct with...

I agree with @czechboy0; an enum representing the valid cases would be the better option here. Still, we have to ensure backwards compatibility.
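To make the trade-off concrete, here is a sketch of the two representations for a schema with `oneOf: [Cat, Dog]` (the `Cat`/`Dog` types and the decoding strategy are placeholders, not the generator's actual output): a flattened struct can represent invalid states (both or neither branch set), while an enum with one case per branch cannot.

```swift
struct Cat: Codable, Hashable, Sendable { var meows: Bool }
struct Dog: Codable, Hashable, Sendable { var barks: Bool }

// Option A: flattened struct — every branch is optional, so payloads with
// both or neither branch set are representable.
struct PetPayload: Codable, Hashable, Sendable {
    var cat: Cat?
    var dog: Dog?
}

// Option B: enum with one case per `oneOf` branch — only valid payloads
// are representable.
enum Pet: Codable, Hashable, Sendable {
    case cat(Cat)
    case dog(Dog)

    init(from decoder: Decoder) throws {
        // Try each branch in order; exactly one should decode successfully.
        if let cat = try? Cat(from: decoder) {
            self = .cat(cat)
        } else if let dog = try? Dog(from: decoder) {
            self = .dog(dog)
        } else {
            throw DecodingError.dataCorrupted(.init(
                codingPath: decoder.codingPath,
                debugDescription: "Payload did not match any oneOf branch."
            ))
        }
    }

    func encode(to encoder: Encoder) throws {
        switch self {
        case .cat(let cat): try cat.encode(to: encoder)
        case .dog(let dog): try dog.encode(to: encoder)
        }
    }
}
```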

The issue was raised in https://github.com/orgs/StanfordSpezi/discussions/69; I formalized it as a feature request in the SpeziLLM repo.