Databricks integration for Alchemer Workflow

Overview

Databricks is a unified data analytics platform used by organizations to store, process, and analyze large-scale data.

The Alchemer integration with Databricks supports four actions: Get data, Update row, Upsert row, and Push to table. With these actions, Alchemer can pull information from Databricks to personalize workflow paths, enrich routing logic, and create merge codes, and it can update information in Databricks without manual intervention.

Common uses for the Alchemer Databricks integration

  • Personalize emails and workflow steps with information in Databricks
  • Use Databricks information in workflow logic
  • Automate data retrieval and updates between Alchemer and Databricks
  • Reduce manual data entry and record maintenance
  • Keep Databricks records synchronized with responses collected in Alchemer
  • Push workflow data into Databricks tables for analytics and reporting

What can the Alchemer Databricks integration do?

The integration provides four workflow actions: Get data, Update row, Upsert row, and Push to table. Each is configured in the setup sections below.

You will need

  • Integration Manager permissions in Alchemer
  • A Databricks service principal with a client ID and secret, with the necessary permissions enabled for that service principal (see the authentication how-to guide)


Set up the Alchemer Databricks integration in a workflow

Databricks | Get data

You will need: An existing workflow and the prerequisites listed in the You will need section above.

Configure the action

  1. Open your workflow in Workflow builder.
  2. On the right side, drag and drop the Databricks connection where you want the action to trigger.
  3. In the connection box, click the pencil icon in the top right corner.
  4. Select Databricks | Get data.
  5. Databricks | Authentication: Select an existing authentication or create a new authentication.
  6. Databricks | Choose warehouse: From the dropdown, select the Databricks warehouse that will run the queries.
  7. Databricks | Choose catalog: Select the Databricks catalog you would like to use from the dropdown.
  8. Databricks | Choose schema: Select the Databricks schema you would like to use from the dropdown.
  9. Databricks | Choose table: Select the Databricks table you would like to use from the dropdown.
  10. Databricks | Find row: Select the Alchemer field containing the lookup value.
  11. Databricks | Get data back: Select the Databricks fields you want returned to the workflow, then confirm. Only the fields selected here will be returned to the workflow.
  12. Save the action.

Status codes

  • 200: A single row was successfully found
  • 201: Query ran successfully, but no rows were found
  • 202: More than one row was found. The first row is used for the values returned to Alchemer
  • 400: The external integration returned an error
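
Alchemer builds and runs this lookup for you, but it can help to picture the equivalent query. The sketch below approximates the Get data lookup using the databricks-sql-connector Python library; the workspace, warehouse path, table, columns, and lookup value are hypothetical placeholders, not values the integration actually uses.

  # Rough equivalent of "Get data"; every name below is a placeholder.
  from databricks import sql  # pip install databricks-sql-connector

  with sql.connect(
      server_hostname="dbc-example.cloud.databricks.com",  # assumed workspace
      http_path="/sql/1.0/warehouses/abc123",              # assumed warehouse
      access_token="<access-token>",
  ) as conn:
      with conn.cursor() as cur:
          # Match the Alchemer lookup value against a key column; only the
          # fields chosen in "Get data back" are selected.
          cur.execute(
              "SELECT email, plan FROM main.crm.contacts "
              "WHERE contact_id = :contact_id",
              {"contact_id": "12345"},  # value from the Alchemer field
          )
          rows = cur.fetchall()

  # The outcomes map onto the status codes above: 0 rows -> 201,
  # exactly 1 row -> 200, 2+ rows -> 202 (the first row is used).
  record = rows[0] if rows else None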

Databricks | Update row

You will need: An existing workflow and the prerequisites listed in the You will need section above.

Configure the action

  1. Open your workflow in Workflow builder.
  2. On the right side, drag and drop the Databricks connection where you want the action to trigger.
  3. In the connection box, click the pencil icon in the top right corner. 
  4. Select Databricks | Update row.
  5. Databricks | Authentication: Select an existing authentication or create a new authentication.
  6. Databricks | Choose warehouse: From the dropdown, select the Databricks warehouse that will run the queries.
  7. Databricks | Choose catalog: Select the Databricks catalog you would like to use from the dropdown.
  8. Databricks | Choose schema: Select the Databricks schema you would like to use from the dropdown.
  9. Databricks | Choose table: Select the Databricks table you would like to use from the dropdown.
  10. Databricks | Find row: Select the Alchemer field containing the lookup value.
    Note: All matching rows will be updated; ensure that your lookup value is unique if you want to update only one row.
  11. Databricks | Update row: Choose the workflow data you want to use to update a row in your Databricks table.
  12. Save the action.

Status codes

  • 200: Successfully updated row
  • 201: Query ran successfully. No rows updated
  • 202: Multiple rows were found. X row(s) updated
  • 400: The external integration returned an error
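
Under the hood, the update is equivalent to a SQL UPDATE keyed on the lookup value. A minimal sketch, assuming a cursor cur opened as in the Get data sketch and the same hypothetical table and columns:

  # Rough equivalent of "Update row"; names are placeholders and cur is a
  # cursor opened as in the Get data sketch above.
  cur.execute(
      "UPDATE main.crm.contacts "
      "SET email = :email, plan = :plan "
      "WHERE contact_id = :contact_id",
      {"email": "new@example.com", "plan": "pro", "contact_id": "12345"},
  )
  # Like the action itself, this UPDATE touches every row matching the
  # lookup value, which is why a unique key matters.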

Databricks | Upsert row

You will need: An existing workflow and the prerequisites listed in the You will need section above.

Configure the action

  1. Open your workflow in Workflow builder.
  2. On the right side, drag and drop the Databricks connection where you want the action to trigger.
  3. In the connection box, click the pencil icon in the top right corner. 
  4. Select Databricks | Upsert row.
  5. Databricks | Authentication: Select an existing authentication or create a new authentication.
  6. Databricks | Choose warehouse: From the dropdown, select the Databricks warehouse that will run the queries.
  7. Databricks | Choose catalog: Select the Databricks catalog you would like to use from the dropdown.
  8. Databricks | Choose schema: Select the Databricks schema you would like to use from the dropdown.
  9. Databricks | Choose table: Select the Databricks table you would like to use from the dropdown.
  10. Databricks | Find row: Select the Alchemer field containing the lookup value.
    Note: All matching rows will be updated; ensure that your lookup value is unique if you want to update only one row. If no matching rows are found, a new row will be created.
  11. Databricks | Upsert row: Choose the workflow data you want to use to upsert a row in your Databricks table.
  12. Save the action.

Status codes

  • 200: Successfully inserted row. 
  • 201: Query ran successfully. No rows updated. 
  • 202: Successfully updated X row(s).
  • 400: The external integration returned an error.
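
Upsert behavior is conventionally expressed in Databricks SQL with MERGE INTO, which updates matching rows and inserts a new one when nothing matches. A sketch of an equivalent statement, with hypothetical names and the same assumed cursor (the integration's actual SQL is not published):

  # Rough equivalent of "Upsert row" as a Databricks SQL MERGE; names are
  # placeholders and cur is a cursor opened as in the Get data sketch.
  cur.execute(
      "MERGE INTO main.crm.contacts AS t "
      "USING (SELECT :contact_id AS contact_id, :email AS email) AS s "
      "ON t.contact_id = s.contact_id "
      "WHEN MATCHED THEN UPDATE SET t.email = s.email "
      "WHEN NOT MATCHED THEN INSERT (contact_id, email) "
      "VALUES (s.contact_id, s.email)",
      {"contact_id": "12345", "email": "new@example.com"},
  )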

Databricks | Push to table

You will need: An existing workflow and the prerequisites listed in the You will need section above.

Configure the action

  1. Open your workflow in Workflow builder.
  2. On the right side, drag and drop the Databricks connection where you want the action to trigger.
  3. In the connection box, click the pencil icon in the top right corner. 
  4. Select Databricks | Push to table.
  5. Databricks | Authentication: Select an existing authentication or create a new authentication.
  6. Databricks | Choose warehouse: From the dropdown, select the Databricks warehouse that will run the queries.
  7. Databricks | Choose catalog: Select the Databricks catalog you would like to use from the dropdown.
  8. Databricks | Choose schema: Select the Databricks schema you would like to use from the dropdown.
  9. Databricks | Choose table: Select the Databricks table you would like to use from the dropdown.
  10. Databricks | Insert row into table: Choose the workflow data you want to use to insert a row into your Databricks table.
  11. Save the action.

Status codes

  • 200: Successfully inserted row. 
  • 400: The external integration returned an error.
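
Push to table maps onto a plain SQL INSERT. A minimal sketch, with hypothetical names and the same assumed cursor:

  # Rough equivalent of "Push to table"; names are placeholders and cur is
  # a cursor opened as in the Get data sketch.
  cur.execute(
      "INSERT INTO main.crm.responses (contact_id, nps_score, submitted_at) "
      "VALUES (:contact_id, :nps, current_timestamp())",
      {"contact_id": "12345", "nps": 9},
  )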

Testing and Validation

How to test

  • Trigger the workflow and monitor individual runs in the Monitor tab of your workflow.
    • Click on individual workflow runs to see metadata outputs.
  • Confirm the expected update or retrieval occurs in Databricks.
  • Use metadata for verification and debugging.

How to verify results

  • Check the impacted record in Databricks.
  • Ensure retrieved or updated values match expectations.

Monitoring Integration Activity

Where to find logs

  • Go to Results → Monitor.
  • Select the integration step you want to inspect.

What logs display

  • Input/output data for each run of the integration step

Troubleshooting

Authentication issues

  • Incorrect or expired credentials
  • Missing permissions in Databricks

Lookup failures

  • Invalid identifier values
  • No matching records

Mapping errors

  • Unsupported or invalid fields
  • Incorrect formatting

API errors

  • Validation issues
  • Endpoint restrictions

FAQs

What permissions do I need?
Integration Manager in Alchemer and API permissions in Databricks. Databricks requires a service principal with a client ID and secret, along with permissions enabled for that service principal. More details in the authentication how-to guide.
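
For context, Databricks exchanges a service principal's client ID and secret for a short-lived access token using its OAuth machine-to-machine flow. Alchemer performs an equivalent exchange for you once the authentication is configured; the sketch below only illustrates the mechanism, with a placeholder workspace URL and credentials:

  # Databricks OAuth M2M token exchange; the workspace URL and credentials
  # are placeholders. Alchemer handles this internally after setup.
  import requests

  workspace = "https://dbc-example.cloud.databricks.com"
  resp = requests.post(
      f"{workspace}/oidc/v1/token",
      auth=("<client-id>", "<client-secret>"),  # service principal credentials
      data={"grant_type": "client_credentials", "scope": "all-apis"},
  )
  resp.raise_for_status()
  access_token = resp.json()["access_token"]  # short-lived bearer token
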
When does the integration run?
When the workflow triggers and reaches the integration step.
Can I use multiple Databricks actions in one workflow?
Yes. Actions can work independently or together.
Why isn’t my data updating?
Check the Action Log for lookup issues, mapping problems, or API errors.
What if I need additional functionality?
Contact Alchemer Support for enhancement requests.