LLM Mesh

Configuration

These settings control the Dataiku features and capabilities that let you leverage Large Language Models (LLMs) to build AI-powered applications. They include:

  • Prompt studios
  • Prompt recipes
  • Text classification and summarization recipes
  • Retrieval-augmented models and embedding recipes
  • LLM connections

Retrieval augmented generation

It is recommended to use the dedicated internal code env for retrieval augmented generation. If it has not been installed yet, contact your administrator, who can install it from the Administration settings. Note that the container image is not built for this code env.
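
To illustrate what retrieval-augmented generation does conceptually, here is a minimal, self-contained sketch: documents are embedded, the most similar ones are retrieved for a query, and they are injected into the prompt. The bag-of-words "embedding" and the helper names (`embed`, `retrieve`) are illustrative assumptions, not Dataiku's implementation, which uses real embedding models and vector stores.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding"; a real setup uses an embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "The LLM Mesh routes queries to configured LLM connections.",
    "Paris is the capital of France.",
    "Retrieval augmented generation enriches prompts with retrieved context.",
]
context = retrieve("How does retrieval augmented generation work?", docs)
# The retrieved context is prepended to the prompt sent to the LLM.
prompt = "Answer using this context:\n" + "\n".join(context) + "\n\nQuestion: ..."
```

In a real deployment, the embedding step is exactly what the dedicated code env provides the packages for.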

PII Detection

It is recommended to use the dedicated internal code env for PII detection. If it has not been installed yet, contact your administrator, who can install it from the Administration settings. Note that the container image is not built for this code env.
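
As a rough idea of what PII detection involves, here is a minimal regex-based sketch. The patterns and function names (`find_pii`, `redact`) are hypothetical simplifications; the dedicated code env ships proper detection models that go well beyond pattern matching.

```python
import re

# Hypothetical minimal PII patterns, for illustration only.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def find_pii(text):
    # Return (kind, match) pairs for each detected PII span.
    hits = []
    for kind, pattern in PII_PATTERNS.items():
        hits.extend((kind, m) for m in pattern.findall(text))
    return hits

def redact(text):
    # Replace each detected span with a typed placeholder.
    for kind, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{kind.upper()}]", text)
    return text
```

Regexes catch only well-structured identifiers; detecting names or addresses requires the model-based approach the dedicated code env is meant for.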

Hugging Face

Applies per process.
In seconds. Acts as a TTL when there is only one instance.
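
The per-process, TTL-like behavior described above can be sketched with a small expiring cache. This is a generic illustration under the assumption that the setting controls how long a loaded resource is kept in memory per process; the class name `TTLCache` is not a Dataiku API.

```python
import time

class TTLCache:
    """Per-process cache: entries expire ttl_seconds after insertion.

    With a single instance this behaves as a plain TTL; with several
    instances, each process keeps its own independent copy, so the
    effective behavior is only TTL-like per process."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}

    def put(self, key, value):
        # Record the value together with its absolute expiry time.
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        item = self._store.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired: evict and miss
            return None
        return value
```

`time.monotonic()` is used rather than `time.time()` so expiry is unaffected by system clock adjustments.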

Evaluation recipes

See the documentation for the required packages.

Document extraction recipes

Use this connection by default for new managed output folders, even when a better contextual connection could be selected (for example, the same connection as the input).
It is recommended to use the dedicated internal code env for document extraction. If it has not been installed yet, contact your administrator, who can install it from the Administration settings. Note that the container image is not built for this code env.

Agents

Defines who is authorized to create, update and delete local MCP client tools. Using local MCP client tools is always allowed for all users.

Auditing

These settings control auditing in cases where there is no associated connection, notably for agents.

Write completion traces to the audit log. This significantly increases the audit log size.

Trace Explorer

  • Project
  • Web app

Cost control

These settings help you manage your LLM costs effectively. You can set quotas, track spending, configure email notifications for threshold alerts, and decide whether to block queries once a quota is exhausted.

For each quota, you can configure the LLM providers, DSS projects and users they apply to. The Fallback Quota applies to queries not matching any other quota.
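
The matching semantics above (each quota scoped by provider, project and user; unmatched queries falling through to the Fallback Quota) can be sketched as a first-match rule. The class and function names (`Quota`, `quota_for`) and the convention that an empty scope means "any" are illustrative assumptions, not Dataiku's internals.

```python
from dataclasses import dataclass, field

@dataclass
class Quota:
    name: str
    providers: set = field(default_factory=set)  # empty set = any provider
    projects: set = field(default_factory=set)   # empty set = any project
    users: set = field(default_factory=set)      # empty set = any user

    def matches(self, provider, project, user):
        return ((not self.providers or provider in self.providers)
                and (not self.projects or project in self.projects)
                and (not self.users or user in self.users))

def quota_for(query, quotas, fallback):
    # First matching quota wins; the fallback catches everything else.
    for q in quotas:
        if q.matches(**query):
            return q
    return fallback

quotas = [Quota("openai-prod", providers={"openai"}, projects={"PROD"})]
fallback = Quota("fallback")
```

A query against an Azure provider in a DEV project would match no scoped quota here and land on the fallback, mirroring how the Fallback Quota catches queries not matching any other quota.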


Rate limiting
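
As background on what a rate limit enforces, here is a generic token-bucket sketch: requests are allowed at a sustained rate with limited bursts. This is a standard technique for illustration only; it is an assumption, not a description of how Dataiku implements its rate limiting.

```python
import time

class TokenBucket:
    """Allow `rate` requests per second with bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        # Refill tokens in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1  # spend one token for this request
            return True
        return False          # over the limit: reject or queue the request
```

A limiter like this rejects (or delays) the burst of requests beyond the configured capacity, then admits new ones as tokens refill over time.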