The ultimate Microsoft Fabric, Power BI, Azure AI, and SQL learning event: Join us in Stockholm, September 24-27, 2024.
Hi,
I am trying to transfer files from a Lakehouse to a Warehouse using a notebook. How can we do that? I used an abfss path to read the files from the Lakehouse and an abfss path for the Warehouse, but I am getting an error.
Lakehouse files to Warehouse tables.
Thanks,
Red
You could perhaps use Dataflows Gen2 with Append method.
Pattern to incrementally amass data with Dataflow Gen2 - Microsoft Fabric | Microsoft Learn
This should work similarly for both Lakehouse and Warehouse.
Or just use a cross-database query to get the Lakehouse data into your Warehouse, as shown in the previous comment (note that this works only for Lakehouse tables, not Lakehouse files).
I don't think a notebook can write data to a Warehouse, so I don't think what you are asking is directly possible.
I think you will need to use a tool other than a notebook (Dataflows Gen2, a T-SQL script or stored procedure, a data pipeline, etc.).
Or you can use a notebook to load data into a Lakehouse table, and then use cross-database queries to make the data available in the Warehouse.
If you want to use a notebook to load data, why not just use a Lakehouse instead of a Warehouse? I am curious.
If someone knows how to use a notebook to write data to a Warehouse, I am interested to know.
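The Lakehouse-then-cross-database pattern above can be sketched roughly as follows. This is a hypothetical sketch, not anything from the thread: the Lakehouse, Warehouse, and table names are placeholders, and the generated T-SQL would be run on the Warehouse side (SQL editor, stored procedure, or a pipeline Script activity).

```python
# Hypothetical sketch: a notebook appends to a Lakehouse table with Spark,
# then a cross-database query copies that table into the Warehouse.
# All names ("MyLakehouse", "dbo.sales", "sales") are placeholders.

def build_cross_db_insert(lakehouse: str, warehouse_table: str,
                          lakehouse_table: str) -> str:
    """Build the T-SQL that copies a Lakehouse table into a Warehouse table."""
    return (
        f"INSERT INTO {warehouse_table} "
        f"SELECT * FROM [{lakehouse}].dbo.[{lakehouse_table}];"
    )

def write_to_lakehouse(spark, df):
    # Inside a Fabric notebook, `spark` is already provided; this appends
    # rows to a managed Lakehouse table named "sales".
    df.write.mode("append").saveAsTable("sales")

if __name__ == "__main__":
    # Statement to run on the Warehouse after the Lakehouse write finishes.
    print(build_cross_db_insert("MyLakehouse", "dbo.sales", "sales"))
```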
Right now we can't write to a Warehouse without going through the Warehouse SQL endpoint (we can read via Spark, as you know). So the only way to work with the data is to use a connectivity method like pyodbc and connect to the Warehouse SQL endpoint from the notebook.
I'll dig up an example I did a while ago to show this.
Connectivity to data warehousing - Microsoft Fabric | Microsoft Learn
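A minimal sketch of the pyodbc approach described above, assuming the ODBC Driver 18 for SQL Server is available in the notebook environment and authentication uses an Entra ID access token. The endpoint host, database, table, and column names are all placeholders; adapt them to your Warehouse.

```python
# Hedged sketch: connect a notebook to the Warehouse SQL endpoint with pyodbc
# and insert rows. The server/database/table names below are placeholders.

import struct

def token_to_odbc_attr(access_token: str) -> bytes:
    """Encode an Entra ID token the way the ODBC driver expects it:
    UTF-16-LE bytes prefixed with their length as a little-endian int."""
    raw = access_token.encode("utf-16-le")
    return struct.pack("<i", len(raw)) + raw

def insert_rows(server: str, database: str, token: str, rows):
    import pyodbc  # imported lazily so the helper above stays importable
    SQL_COPT_SS_ACCESS_TOKEN = 1256  # driver attribute for token-based auth
    conn = pyodbc.connect(
        f"Driver={{ODBC Driver 18 for SQL Server}};"
        f"Server={server};Database={database};",
        attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_to_odbc_attr(token)},
    )
    with conn.cursor() as cur:
        # Placeholder table/columns; any T-SQL the Warehouse supports works.
        cur.executemany("INSERT INTO dbo.sales (id, amount) VALUES (?, ?)", rows)
        conn.commit()
```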
Some good answers here, but my question is: why do you want to move data from a Lakehouse to a Warehouse?
Would shortcuts or views be an alternative option or are you hoping to use T-SQL permissions?
Hi @reded1 ,
Transferring files from a Lakehouse to a Warehouse using a notebook can be a little difficult; here is a more convenient approach.
Once you're in the warehouse in Fabric, select the place I've highlighted.
Select the Lakehouse you want to connect to.
Now it's time to use your connected Lakehouse.
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution so other members can find it more quickly.
If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
So I am using a medallion architecture to move files from the bronze Lakehouse layer to the Warehouse.
I do know that it's possible with a pipeline,
but I want to know the solution from a notebook,
where I am trying to move selected raw files to Warehouse Delta tables.
So whenever a new incremental load happens, the new data will get appended to the Warehouse table on an incremental basis.
If you have a more optimized or better way to move data from Lakehouse to Warehouse using a medallion architecture, please let me know.
I am loading data from an on-premises database to the Lakehouse as a full load (parquet), stored as database/year/month/date/table name.
I am now also getting incremental data from the same database on a daily basis,
and I want to add these incremental loads to the silver Warehouse.
Please suggest a good way to run a daily trigger and update my silver Warehouse layer.
thanks,
Red
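A hedged sketch of a daily incremental job matching the database/year/month/date/table folder layout described above. The root abfss path, database/table names, and the staging-table step are assumptions (the thread's earlier answers suggest landing in a Lakehouse table first and then using a cross-database query or pipeline to reach the silver Warehouse):

```python
# Hypothetical daily-trigger sketch. The path helper assumes the layout
# root/database/year/month/date/table with zero-padded month and day;
# adjust it to the real folder convention.

from datetime import date

def incremental_path(root: str, db: str, table: str, day: date) -> str:
    """Build the folder holding one day's incremental parquet drop."""
    return f"{root}/{db}/{day.year}/{day.month:02d}/{day.day:02d}/{table}"

def append_days_load(spark, root: str, db: str, table: str, day: date):
    # Inside a Fabric notebook: read the day's parquet files and append to a
    # Lakehouse staging table; a cross-database INSERT (or a pipeline) then
    # moves the staged rows into the silver Warehouse table.
    df = spark.read.parquet(incremental_path(root, db, table, day))
    df.write.mode("append").saveAsTable(f"stg_{table}")

if __name__ == "__main__":
    # Placeholder workspace/Lakehouse names in the abfss root.
    print(incremental_path(
        "abfss://ws@onelake.dfs.fabric.microsoft.com/lh.Lakehouse/Files",
        "erp", "orders", date(2024, 8, 15)))
```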