
Our MoE model: KafkaLM 8x7B German


Why KafkaLM?

We’re excited to present KafkaLM-8x7b-German-V0.1: a MoE (Mixture of Experts) model based on Mistral AI’s Mixtral 8x7B, further fine-tuned on a curated set of high-quality open-source instruction datasets (translated from English into German). This model is part of our KafkaLM series, aimed at delivering proficient yet creative language models that can tackle diverse everyday and business-related use cases in German.


Key Details

KafkaLM-8x7b was fine-tuned on an 8k-example filtered subset of the seedboxai/multitask_german_examples_32k dataset and follows a straightforward prompt template with system, user, and assistant blocks. This structure helps the model produce reliable, contextually aware responses, whether you’re drafting an email or brainstorming new ideas.
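
As a concrete illustration, here is a minimal sketch of how such a prompt could be assembled in Python. The exact block markers (`<|system|>`, `<|user|>`, `<|assistant|>`, each followed by the text and a closing `</s>`) and the default system message are assumptions based on the KafkaLM series; please check the model card for the authoritative template.

```python
# Minimal sketch of the system/user/assistant prompt layout.
# The <|system|>/<|user|>/<|assistant|> markers and the closing </s> tokens
# are assumptions; consult the KafkaLM model card for the exact template.
def build_prompt(
    user_message: str,
    system_message: str = "Du bist ein freundlicher und hilfsbereiter KI-Assistent.",
) -> str:
    return (
        f"<|system|>\n{system_message}</s>\n"
        f"<|user|>\n{user_message}</s>\n"
        f"<|assistant|>\n"
    )

# Example: ask the model to draft a short, friendly customer email.
print(build_prompt("Schreibe eine kurze, freundliche E-Mail an einen Kunden."))
```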


Getting Started

Thanks to the Hugging Face ecosystem, getting started is straightforward: load the model, set up your prompt, and you have a robust German-language LLM that handles diverse tasks. With 4-bit quantized inference support, it also fits within tighter hardware budgets.
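
The snippet below is a minimal sketch of loading the model in 4-bit with transformers and bitsandbytes. The repository id "seedboxai/KafkaLM-8x7B-German-V0.1" and the prompt markers are assumptions; adjust them to the actual Hugging Face repo and the template documented on the model card.

```python
# Minimal sketch: 4-bit quantized inference with transformers + bitsandbytes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "seedboxai/KafkaLM-8x7B-German-V0.1"  # assumed repository id

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)

# Prompt markers below follow the assumed KafkaLM template.
prompt = (
    "<|system|>\nDu bist ein freundlicher und hilfsbereiter KI-Assistent.</s>\n"
    "<|user|>\nSchreibe eine kurze E-Mail an einen Kunden.</s>\n"
    "<|assistant|>\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```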

Try out KafkaLM-8x7b-German to experience a fresh approach to German NLP that merges precision, creativity, and user-friendly design!