Last updated: Oct 31, 2024
Closed beta feature
For cloud connectivity with Redox, you decide which cloud provider and cloud product(s) to use. Then, you'll need to create a cloud destination in your Redox organization.
You'll need to perform some steps in your cloud product(s) and some in Redox. You can perform Redox setup in our dashboard or with the Redox Platform API.
This article is for this combination of cloud products:
- Microsoft Azure (with Data Lake)
- Databricks
First, complete these steps in Microsoft Azure.
- Navigate to the Microsoft Azure dashboard and log in. Review Azure's quickstart guide to get started.
- Create an application through Azure Entra. Review Azure's help article. This is where you'll get a client ID and tenant ID, which you'll need for Redox setup later.
- Create a new secret for your application. This is where you'll get the client secret value, which you'll need for Redox setup later.
- Create a new storage account. Set the primary service to the Data Lake Storage option.
- Add a new container. You'll need the name of the container for Redox setup later.
- Assign the Blob Data Owner role to the application you created in step #2.
Next, create a cloud destination in your Redox organization; this is where your data will be pushed. To set up in the Redox dashboard, follow these steps.
- For the select destination step, follow the instructions for creating a cloud destination.
- From the Product type field, select Databricks.
- For the configure destination step, populate these fields. Then click the Next button.
- Storage account name: Enter the name of the storage account you created in Azure. Locate this value in the Azure dashboard.
- Container name: Enter the name of the container you created in Azure. Locate this value in the Azure container configuration.
- File name prefix (optional): Enter any prefix you want prepended to new files when they're created in the Data Lake container. Add / to put the files in a subdirectory. For example, redox/ puts all the files in the redox directory.
- For the auth credential step, either a drop-down list of existing auth credentials displays or a new auth credential form opens. Learn how to create an auth credential for OAuth 2.0 (2-legged).
- For the verify step, follow the instructions for verifying a destination.
To set up with the Redox Platform API instead, follow these steps.
- In your terminal, prepare the /v1/authcredentials request.
- Specify these values in the request.
- Locate the clientId and clientSecret values in the Microsoft Azure dashboard.
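As an illustration, the request body might look like the sketch below. The field names and structure are assumptions for illustration only; consult the Redox Platform API reference for the actual /v1/authcredentials schema. Replace the placeholders with the client ID, client secret value, and tenant ID from your Azure Entra application.

```json
{
  "name": "azure-databricks-credential",
  "clientId": "<client ID from your Azure Entra application>",
  "clientSecret": "<client secret value from your Azure Entra application>",
  "tenantId": "<tenant ID from your Azure Entra application>"
}
```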
- You should get a successful response with details for the new auth credential.
- In your terminal, prepare the /v1/environments/{environmentId}/destinations request.
- Specify these values in the request.
- Set authCredential to the auth credential ID from the /v1/authcredentials response you received earlier.
- Populate cloudProviderSettings with the settings below (adjust values based on the storage account and container setup in Azure configuration).
- The fileNamePrefix is optional. If specified, it's prepended to the path of files created in the Data Lake container.
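For illustration, the cloudProviderSettings object might look like the sketch below. The settings correspond to the storage account name, container name, and optional file name prefix described above, but the exact camelCase key names and nesting are assumptions; check the Redox Platform API reference for the real schema.

```json
{
  "cloudProviderSettings": {
    "storageAccountName": "<storage account name from Azure>",
    "containerName": "<container name from Azure>",
    "fileNamePrefix": "redox/"
  }
}
```

With fileNamePrefix set to redox/, all files are created under the redox directory in the container.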
- You should get a successful response with details for the new destination for Microsoft Azure and Databricks.
- Your new destination can now receive messages. We push data to the Data Lake storage account as JSON files, which Databricks can then ingest.