Last updated: Apr 9, 2026
To populate your Microsoft Azure repository with healthcare data from an EHR system via Redox (and then optionally feed that data into Fabric or Snowflake for analytics), you must configure a specific Redox cloud destination. A Redox destination represents where a message is delivered, like the address in the “To” line of an email header. Learn more about connecting Redox to your cloud repository.
You’ll need to perform some steps in your cloud product(s) and some in Redox. You can perform Redox setup in our dashboard or with the Redox Platform API.
- Establish a connection with your preferred EHR system. Learn how to request a connection.
- Decide which combination of cloud products to use. Redox currently supports any of these combinations with your Azure cloud repository:
- Microsoft Azure + Fabric (OneLake)
- Microsoft Azure + Snowflake
- Complete your Azure (and any other cloud product) configuration before creating your Redox destination. Save any downloads with secret values, since you’ll need to enter some of these details into the Redox dashboard.
- Grant access to Redox from Azure (and any other cloud product) to authorize Redox to push data to your cloud repository.
- Navigate to the Microsoft Azure dashboard and log in. Review Azure’s quickstart guide to get started.
- Create an application through Microsoft Entra ID (formerly Azure Active Directory). Review Azure’s help article. This is where you’ll get a client ID and tenant ID, which you’ll need for Redox setup later.
- Create a new secret for your application. This is where you’ll get the client secret value, which you’ll need for Redox setup later.
- Navigate to the Fabric dashboard and log in.
- Create a new workspace, or open an existing workspace you want to use for integrating with Redox.
- Within the workspace, create a lakehouse, or open an existing lakehouse you want to use for hosting data.
- Add Contributor permissions for the application you previously created in the Microsoft Azure console.
Next, create a cloud destination in your Redox organization. This destination is where your data will be pushed.
- For the select destination step, follow the instructions for creating a cloud destination.
- From the Product type field, select Snowflake if you’re using that cloud product with Fabric; your Fabric settings will be ingested along with the additional cloud product. Select Fabric if you’re not also using Snowflake.
- For the configure destination step, populate these fields. Then click the Next button.
- Workspace name: Enter the name of the workspace you created in Fabric. Locate this value in the Fabric dashboard.
- Lakehouse name: Enter the name of the lakehouse you created in Fabric. Locate this value in the Fabric workspace configuration.
- File name prefix (optional): Enter any prefix you want prepended to new files when they’re created in the Fabric lakehouse. Add / to put the files in a subdirectory. For example, redox/ puts all the files in the redox directory.
- For the auth credential step, either a drop-down list of existing auth credentials displays or a new auth credential form opens. Choose an existing auth credential or populate the new credential form, including the Token endpoint URL field. Learn how to create an auth credential for OAuth 2.0 (two-legged).
- For the verify step, follow the instructions for verifying a destination.
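For the Token endpoint URL, Microsoft Entra ID exposes a tenant-specific OAuth 2.0 endpoint. A sketch of its standard format, where {tenantId} is the tenant ID you collected during Azure setup:

```
https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token
```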
- In your terminal, prepare the /v1/authcredentials request with these values:
- Locate the clientId and clientSecret values in the Microsoft Azure dashboard.
- The scope should be the same whether you use Fabric or Snowflake.
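A minimal sketch of the /v1/authcredentials request body. The exact Redox payload schema isn’t shown here; the field names clientId, clientSecret, and scope come from the steps above, while the name and authStrategy fields and the placeholder values are assumptions for illustration only:

```json
{
  "name": "azure-fabric-credential",
  "authStrategy": "OAuth 2.0 Two-legged",
  "clientId": "<client ID from your Microsoft Entra ID application>",
  "clientSecret": "<client secret value from your Entra application>",
  "scope": "<scope value; the same whether you use Fabric or Snowflake>"
}
```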
- You should get a successful 200 response and a payload populated with the details of the new auth credential.
- In your terminal, prepare the /v1/environments/{environmentId}/destinations request with these values:
- Set authCredential to the auth credential ID from the /v1/authcredentials response you received earlier.
- Populate cloudProviderSettings with the settings below (adjust values based on the workspace and lakehouse setup in the Fabric configuration).
- Enter the productId based on the additional cloud product you chose:
- Fabric: fabric
- Snowflake: snowflake
- The fileNamePrefix is optional; if added, it gets prepended to the created file path in the Fabric lakehouse. You can also append / after the prefix name to indicate a directory path.
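A hedged sketch of the relevant portion of the /v1/environments/{environmentId}/destinations request body. The productId, fileNamePrefix, authCredential, and cloudProviderSettings names come from the steps above; the workspace and lakehouse key names are assumptions modeled on the dashboard fields, so treat the structure as illustrative, not authoritative:

```json
{
  "authCredential": "<auth credential ID from the /v1/authcredentials response>",
  "cloudProviderSettings": {
    "productId": "fabric",
    "workspaceName": "<your Fabric workspace name>",
    "lakehouseName": "<your Fabric lakehouse name>",
    "fileNamePrefix": "redox/"
  }
}
```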
- You should get a successful 200 response with a payload populated with the details of the new Azure cloud destination. Specifically, the verified status of the destination should be set to true.
- Your new destination can now receive messages. We push data to the lakehouse storage as a JSON file, which is then ingested into Microsoft Fabric.