reded1
Regular Visitor

DataLake to DataWarehouse

Hi,

I am trying to transfer files from a Lakehouse to a Warehouse using a notebook.

How can we do that? I used abfss paths to read the files from the Lakehouse and used an abfss path for the Warehouse, but I am getting an error.

In short: Lakehouse files to Warehouse tables.

Thanks,
Red

2 ACCEPTED SOLUTIONS

You could perhaps use Dataflows Gen2 with the Append method.

Pattern to incrementally amass data with Dataflow Gen2 - Microsoft Fabric | Microsoft Learn

This should work the same way for both Lakehouse and Warehouse.

 

Or just use a cross-database query to get the Lakehouse data into your Warehouse, as shown in the previous comment (note that this works only for Lakehouse tables, not Lakehouse files).

 

 

I don't think a Notebook can write data to a Warehouse, so I don't think what you are asking is directly possible.
I think you will need to use a tool other than a Notebook (Dataflows Gen2, a T-SQL script or stored procedure, a data pipeline, etc.).
Or you can use a Notebook to write the data to a Lakehouse table, and then use a cross-database query to make the data available in the Warehouse.
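For example, a minimal sketch of that two-step approach (the paths and table names are placeholders, and `spark` is the session a Fabric notebook provides):

```python
# Step 1, in the Notebook: land the raw files in a Lakehouse Delta table.
# "Files/raw/customers" and "bronze_customers" are hypothetical names.
df = spark.read.parquet("Files/raw/customers")  # relative path into the attached Lakehouse
df.write.format("delta").mode("append").saveAsTable("bronze_customers")

# Step 2, on the Warehouse side (SQL editor, stored procedure, or pipeline):
# pull the rows across with a cross-database query, assuming the Lakehouse
# has been added to the Warehouse (see the screenshots further down):
#
#   INSERT INTO dbo.silver_customers
#   SELECT * FROM MyLakehouse.dbo.bronze_customers;
```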

 

If you want to use a Notebook to load data, why not just use a Lakehouse instead of a Warehouse? I am curious.

 

If someone knows how to use a Notebook to write data to a Warehouse, I would be interested to know.


Right now we can't write to a Warehouse without going through the Warehouse SQL Endpoint (we can read via Spark, as you know). So the only way to work with the data is to use a connectivity method like pyodbc and connect to the Warehouse SQL Endpoint in the Notebook.

 

I'll dig up an example I did a while ago to show this.

 

Connectivity to data warehousing - Microsoft Fabric | Microsoft Learn
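Until I find it, here is a rough sketch of the pattern, not tested code: the endpoint, database, and table names are placeholders, and the token audience is my assumption for the Warehouse SQL Endpoint.

```python
import struct
import pyodbc
from notebookutils import mssparkutils  # built into Fabric notebooks

# Get a Microsoft Entra token for the SQL endpoint and pack it the way the
# ODBC driver expects: a length prefix followed by UTF-16-LE token bytes.
token = mssparkutils.credentials.getToken("https://database.windows.net/")
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

SQL_COPT_SS_ACCESS_TOKEN = 1256  # connection attribute for passing the token

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<your-warehouse-sql-endpoint>;"  # placeholder
    "DATABASE=<your-warehouse>",             # placeholder
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)
cursor = conn.cursor()
cursor.execute("INSERT INTO dbo.TargetTable (Id) VALUES (?)", 1)
conn.commit()
```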


5 REPLIES
KevinChant
Resolver I

Some good answers here, but my question is: why do you want to move data from a Lakehouse to a Warehouse?

 

Would shortcuts or views be an alternative option, or are you hoping to use T-SQL permissions?

v-huijiey-msft
Community Support

Hi @reded1 ,

 

Transferring files from a Lakehouse to a Warehouse using a notebook can be a little difficult, so I have a more convenient solution here.

 

Once you're in the warehouse in Fabric, select the place I've highlighted.

vhuijieymsft_0-1720747500700.png

 

Select the Lakehouse you want to connect to.

vhuijieymsft_1-1720747500711.png

 

Now it's time to use your connected Lakehouse.

vhuijieymsft_2-1720747527702.png

vhuijieymsft_3-1720747527715.png
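For completeness, the same three-part naming also works from a Notebook if you go through the Warehouse SQL Endpoint, as in the pyodbc sketch in the accepted solutions above. A minimal sketch, with all names as placeholders:

```python
import pyodbc

# Query the attached Lakehouse through the Warehouse SQL endpoint using a
# three-part name. Add authentication (e.g. an access token) as shown in
# the earlier pyodbc sketch; connection details here are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<your-warehouse-sql-endpoint>;"
    "DATABASE=<your-warehouse>"
)
for row in conn.execute("SELECT TOP (10) * FROM MyLakehouse.dbo.MyTable"):
    print(row)
```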

 

If you have any other questions, please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help the other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

So I am using a medallion architecture to move files from the bronze Lakehouse layer to the Warehouse.

I do know that it's possible with a pipeline,

but I want to know the solution from a notebook,

where I am trying to move selected raw files into Warehouse Delta tables.

So whenever a new incremental load happens, the new data will get appended to the Warehouse table on an incremental basis.

If you have a more optimized or better way to move data from Lakehouse to Warehouse using a medallion architecture, please let me know.

I am loading data from an on-premises database to the Lakehouse as a full load (Parquet), stored as database/year/month/date/table name.

I am now also getting incremental data from the same database on a daily basis,

and now I want to add these incremental loads to the silver Warehouse layer.

Please suggest a good way so that I can run a trigger daily and update my silver Warehouse layer.

Thanks,
Red

