How does this export work?
- The model will be exported to a persistent Snowflake function.
- Any Snowflake user can then call it to score records within Snowflake.
Not available: {{cantExportToSnowflake ? 'Export to Snowflake is not available in Visual Analyses. Deploy your model to the Flow first' : model.javaExportCompatibility.reason}}
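For illustration, the exported function could be called from Python through the Snowflake connector. This is a minimal sketch: the connection parameters, the table and column names, and the function name `MY_MODEL_SCORE` are all hypothetical placeholders, not names produced by the export.

```python
# pip install snowflake-connector-python
import snowflake.connector


def score_customers(conn_params: dict):
    """Call the exported model function (hypothetical name) over a table."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        # The exported model is a regular SQL function: call it per record.
        cur.execute(
            "SELECT customer_id, MY_MODEL_SCORE(age, income) AS prediction "
            "FROM customers"
        )
        return cur.fetchall()
    finally:
        conn.close()
```

The same `SELECT … MY_MODEL_SCORE(…)` query can of course also be run directly in a Snowflake worksheet by any user with access to the function.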
How does this export work?
- The model will be exported for use in Java code.
- It can be used in any Java program to score records outside of Dataiku.
Not available: {{model.javaExportCompatibility.reason}}
How does this export work?
- The model will be exported for use in Python code.
- It can be used in any Python program to score records outside of Dataiku.
Not available: {{model.pythonCompatibility.reason}}
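As a sketch of what using such an export looks like, assuming the `dataiku-scoring` package's documented `load_model` entry point; the zip path and the column names are hypothetical:

```python
# pip install dataiku-scoring
from dataikuscoring import load_model

# Path to the zip produced by the export (hypothetical).
model = load_model("path/to/model.zip")

# Score a batch of records; inputs can be a pandas DataFrame
# or a dict of column lists, per the package documentation.
predictions = model.predict({"age": [33, 61], "income": [42000, 58000]})
print(predictions)
```

For classification models, a `predict_proba`-style call to obtain class probabilities is also part of the package's documented surface.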
How does this export work?
- The model will be exported as the original MLflow Model. Some extra DSS files may appear in the zip, but they have no effect.
- The model can be imported in any MLflow-compatible system.
- The model will be exported as an MLflow Model using the "dss" flavor of MLflow. This flavor uses the dataiku-scoring Python package to load the model.
- The model can be imported in any MLflow-compatible system.
The generated MLflow model is also compatible with the "python_function" flavor of MLflow.
Not available: {{model.pythonCompatibility.reason}}
How does this export work?
- The model will be exported to the MLflow format.
- Then registered as an artifact of a new run of the specified experiment.
- Finally added as a new version of a registered model.
The generated MLflow model is compatible with the "python_function" flavor of MLflow.
Not available: {{model.pythonCompatibility.reason}}
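Once registered, the new model version can be pulled back from the MLflow Model Registry with a `models:/` URI. A minimal sketch, in which the tracking URI, registered model name, version number, and input columns are all hypothetical:

```python
import mlflow
import mlflow.pyfunc
import pandas as pd

# Hypothetical tracking server, model name, and version.
mlflow.set_tracking_uri("http://mlflow.example.com:5000")
model = mlflow.pyfunc.load_model("models:/my_registered_model/1")

print(model.predict(pd.DataFrame({"age": [33], "income": [42000]})))
```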
How does this export work?
- The model will be exported as a PMML file.
- The generated PMML file can be imported in any PMML-compatible scoring system.
Not available: {{model.pmmlCompatibility.reason}}
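As one example of a PMML-compatible consumer, the exported file could be loaded with the `pypmml` package. A sketch only: the file path and feature names are hypothetical.

```python
# pip install pypmml  (pypmml needs a Java runtime)
from pypmml import Model

# Load the exported PMML file (path is hypothetical).
model = Model.load("exported_model.pmml")

# Score a single record as a dict of feature name -> value.
result = model.predict({"age": 42, "income": 55000})
print(result)
```

Other PMML consumers (JPMML, scoring engines built into various databases and BI tools) accept the same file without modification.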
There are no export formats compatible with this model.