Text-to-text modelling

Text translation

Overview

LLMs have proven proficient at translation tasks, although the actual quality depends heavily on the chosen model. Nonetheless, we implement the same capabilities as scikit-llm and offer a locally running translation interface.

Example:

from skollama.models.ollama.text2text.translation import OllamaTranslator
from skllm.datasets import get_translation_dataset

X = get_translation_dataset()
t = OllamaTranslator(model="llama3", output_language="English")
translated_text = t.fit_transform(X)
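
The input does not have to come from the bundled dataset; any list (or array) of strings can be passed, since the estimator follows the scikit-learn transformer interface. A minimal sketch, assuming a local Ollama server with the "llama3" model pulled (the sample sentences are illustrative):

from skollama.models.ollama.text2text.translation import OllamaTranslator

# Any iterable of strings can be translated; the sentences below are only examples.
X = [
    "Ich liebe maschinelles Lernen.",
    "La vie est belle.",
]

t = OllamaTranslator(model="llama3", output_language="English")
translated_text = t.fit_transform(X)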

API Reference

The following API reference lists only the parameters needed to initialize the estimator. The remaining methods follow the standard scikit-learn transformer syntax.

OllamaTranslator

from skollama.models.ollama.text2text.translation import OllamaTranslator
Parameter       | Type           | Description
model           | str, optional  | Model to use, by default "llama3".
host            | str, optional  | Ollama host to connect to, by default "http://localhost:11434".
options         | dict, optional | Additional options to pass to the Ollama API, by default None.
output_language | str, optional  | Language to translate to, by default "English".
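
As an illustration, the estimator can be pointed at a non-default host and given extra generation options. This is a sketch, not part of the example above; the host URL is a placeholder, and the "temperature" key is an assumption about the options accepted by the Ollama API:

from skollama.models.ollama.text2text.translation import OllamaTranslator

# The host URL below is a placeholder for wherever your Ollama server runs.
# The "temperature" option is assumed to be supported by the Ollama API.
t = OllamaTranslator(
    model="llama3",
    output_language="German",
    host="http://my-ollama-host:11434",
    options={"temperature": 0.0},
)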