
Connecting Azure Machine Learning to Fabric OneLake

Here is a workaround for accessing OneLake data from Azure ML. It uses a Python script, run from Azure ML Notebooks, to read files from OneLake.

  1. Create a compute instance.
  2. From Notebooks, create a new .py or .ipynb file and use the sample code from https://learn.microsoft.com/en-us/fabric/onelake/onelake-access-python#sample (a sketch of this file appears after the list).
  3. You can get "myWorkspace" and "myLakehouse" from the Lakehouse properties URL, which is formatted as follows: https://onelake.dfs.fabric.microsoft.com/myWorkspace/myLakehouse/Files
  4. To run the code in Notebooks, open a terminal and attach it to your compute.
  5. In the command line, install the Azure Storage and Azure Identity packages: $pip install azure-storage-file-datalake azure-identity
  6. Navigate to the location of your Python file, for example: $cd ~/cloudfiles/code/Users/admin
  7. The code uses the default credentials of the user logged in from the command line. Log in using: $az login --identity
  8. Make sure the logged-in user has access to the files in the Fabric Lakehouse.
  9. Run the Python file to list the files in the OneLake/Files directory: $python ./onelake.py
  10. If it succeeds, you should see the files listed.
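
For reference, here is a minimal sketch of what onelake.py (step 9) might look like, following the pattern of the Microsoft sample linked in step 2. The myWorkspace and myLakehouse names are the placeholders from step 3, not real values; substitute your own.

```python
# onelake.py - minimal sketch following the pattern of the Microsoft sample.
# "myWorkspace" and "myLakehouse" are placeholders; substitute the values
# from your Lakehouse properties URL (step 3).
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://onelake.dfs.fabric.microsoft.com"
WORKSPACE_NAME = "myWorkspace"        # Fabric workspace (placeholder)
LAKEHOUSE_PATH = "myLakehouse/Files"  # Lakehouse Files directory (placeholder)

# DefaultAzureCredential picks up the identity established by
# `az login --identity` on the compute instance (step 7).
credential = DefaultAzureCredential()
service_client = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=credential)

# In OneLake, the Fabric workspace maps to a file system (container).
file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

# List everything under the lakehouse's Files directory.
for path in file_system_client.get_paths(path=LAKEHOUSE_PATH):
    print(path.name)
```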