
[BUG] Spark Environment: Native Execution Engine setting does not get deployed #776

@frithjofv

Description


Library Version

0.1.34

Python Version

3.11 (CPython 3.11.14)

Operating System

Linux

Authentication Method

Service principal (federated credential)

What is the problem?

Native Execution Engine setting does not get deployed with Environment artifact.

In the source-controlled definition of the Environment artifact, Sparkcompute.yml looks like this:

enable_native_execution_engine: true
driver_cores: 4
driver_memory: 28g
executor_cores: 4
executor_memory: 28g
dynamic_executor_allocation:
  enabled: true
  min_executors: 1
  max_executors: 1
spark_conf:
  spark.sql.ansi.enabled: true
  spark.fabric.resourceProfile: readHeavyForPBI
  spark.databricks.delta.autoCompact.enabled: true
  spark.microsoft.delta.snapshot.driverMode.enabled: true
  spark.microsoft.delta.optimize.fast.enabled: false
runtime_version: 1.3
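For reference, fabric-cicd appears to translate these snake_case YAML keys into the camelCase request body shown in the debug output below. A rough sketch of that mapping (my own illustration, not fabric-cicd's actual code; PyYAML assumed):

```python
import yaml

def camel(key: str) -> str:
    # "enable_native_execution_engine" -> "enableNativeExecutionEngine"
    head, *rest = key.split("_")
    return head + "".join(part.capitalize() for part in rest)

def to_request_body(compute: dict) -> dict:
    body = {}
    for key, value in compute.items():
        if key == "spark_conf":
            # spark_conf shows up in the request as a sparkProperties key/value list
            body["sparkProperties"] = [{"key": k, "value": v} for k, v in value.items()]
        elif isinstance(value, dict):
            body[camel(key)] = {camel(k): v for k, v in value.items()}
        else:
            body[camel(key)] = value
    return body

with open("Sparkcompute.yml") as f:
    print(to_request_body(yaml.safe_load(f)))
```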

fabric-cicd debug shows the following:
[debug] 22:30:26 -
URL: https://api.powerbi.com/v1/workspaces/[REDACTED_BY_ME]/environments/[REDACTED_BY_ME]/staging/sparkcompute?beta=False
Method: PATCH
Request Body:
{
"enableNativeExecutionEngine": true,
"driverCores": 4,
"driverMemory": "28g",
"executorCores": 4,
"executorMemory": "28g",
"dynamicExecutorAllocation": {
"enabled": true,
"minExecutors": 1,
"maxExecutors": 1
},
"sparkProperties": [
{
"key": "spark.sql.ansi.enabled",
"value": true
},
{
"key": "spark.fabric.resourceProfile",
"value": "readHeavyForPBI"
},
{
"key": "spark.databricks.delta.autoCompact.enabled",
"value": true
},
{
"key": "spark.microsoft.delta.snapshot.driverMode.enabled",
"value": true
},
{
"key": "spark.microsoft.delta.optimize.fast.enabled",
"value": false
}
],
"runtimeVersion": 1.3
}
Response Status: 200

(...)
Response Body:
{
"instancePool": {
"name": "Starter Pool",
"type": "Workspace",
"id": "00000000-0000-0000-0000-000000000000"
},
"driverCores": 4,
"driverMemory": "28g",
"executorCores": 4,
"executorMemory": "28g",
"dynamicExecutorAllocation": {
"enabled": true,
"minExecutors": 1,
"maxExecutors": 1
},
"sparkProperties": [
{
"key": "spark.sql.ansi.enabled",
"value": "True"
},
{
"key": "spark.fabric.resourceProfile",
"value": "readHeavyForPBI"
},
{
"key": "spark.databricks.delta.autoCompact.enabled",
"value": "True"
},
{
"key": "spark.microsoft.delta.snapshot.driverMode.enabled",
"value": "True"
},
{
"key": "spark.microsoft.delta.optimize.fast.enabled",
"value": "False"
}
],
"runtimeVersion": "1.3"
}

[debug] 22:30:26 - Request completed in 0.8633520603179932 seconds
22:30:26 - Updated Spark Settings
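Note that the PATCH body contains "enableNativeExecutionEngine": true, yet the 200 response does not echo that field back, while every other setting is returned. To check what the staging settings actually contain, independent of fabric-cicd, the same endpoint can be queried directly. A minimal sketch, assuming azure-identity for the token and that the staging sparkcompute endpoint supports GET (the path is taken from the debug output; the token scope and the IDs are placeholders/assumptions):

```python
import requests
from azure.identity import DefaultAzureCredential

# Placeholders for the GUIDs redacted above.
WORKSPACE_ID = "<workspace-id>"
ENVIRONMENT_ID = "<environment-id>"

# Scope is my assumption for authenticating against the Fabric/Power BI API.
token = DefaultAzureCredential().get_token("https://api.fabric.microsoft.com/.default").token

url = (
    f"https://api.powerbi.com/v1/workspaces/{WORKSPACE_ID}"
    f"/environments/{ENVIRONMENT_ID}/staging/sparkcompute"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

# If the setting were persisted, this key should be present and true.
print(resp.json().get("enableNativeExecutionEngine"))
```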

Steps to reproduce

  1. Create an environment in the feature workspace.
  2. In the environment's UI (Spark compute -> Acceleration), check the 'Enable native execution engine' box.
  3. Publish the environment.
  4. Commit to the feature branch using Fabric workspace Git integration.
  5. In GitHub, merge into the ppe branch.
  6. Deploy to the ppe workspace using fabric-cicd (see the sketch below).
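For completeness, step 6 is invoked roughly like this (simplified; parameter values are placeholders, parameter names per the fabric-cicd docs as I understand them):

```python
from fabric_cicd import FabricWorkspace, publish_all_items

# Placeholders; the real pipeline resolves these from the CI environment.
target = FabricWorkspace(
    workspace_id="<ppe-workspace-id>",
    repository_directory="<path-to-repo-root>",
    item_type_in_scope=["Environment", "Notebook"],
)

# Publishes all items in scope, including the Environment artifact and its Sparkcompute.yml.
publish_all_items(target)
```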

Expected behavior

I expected the 'Enable native execution engine' property of the deployed Spark environment in the ppe workspace to be checked (true).

Actual behavior

The 'Enable native execution engine' property of the deployed Spark environment in the ppe workspace was unchecked.

Is there an ideal solution?

No response

Additional context, screenshots, logs, error output, etc

To debug further, I created a temporary branch from the ppe workspace and synced its contents back to GitHub.

The Sparkcompute.yml then looked like this:

enable_native_execution_engine: false
driver_cores: 4
driver_memory: 28g
executor_cores: 4
executor_memory: 28g
dynamic_executor_allocation:
  enabled: true
  min_executors: 1
  max_executors: 1
spark_conf:
  spark.sql.ansi.enabled: True
  spark.fabric.resourceProfile: readHeavyForPBI
  spark.databricks.delta.autoCompact.enabled: True
  spark.microsoft.delta.snapshot.driverMode.enabled: True
  spark.microsoft.delta.optimize.fast.enabled: False
runtime_version: 1.3
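To make the difference explicit, here is a small comparison of the two versions of Sparkcompute.yml (the file names are hypothetical, standing in for the committed source and the file synced back from ppe):

```python
import yaml

with open("Sparkcompute.source.yml") as f:   # committed from the feature workspace
    source = yaml.safe_load(f)
with open("Sparkcompute.ppe.yml") as f:      # synced back from the ppe workspace
    deployed = yaml.safe_load(f)

for key in source:
    if source[key] != deployed.get(key):
        print(f"{key}: {source[key]} -> {deployed.get(key)}")

# Only enable_native_execution_engine changes (True -> False); the capitalized
# True/False under spark_conf still parse as YAML booleans, so those compare equal.
```

So every other setting round-trips correctly; only the native execution engine flag is lost on deployment.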
