Entity Framework (EF), as of version 6, still has no built-in support for bulk inserts. If you have, say, 750,000 records to insert, EF will send 750,000 individual INSERT statements to the database. Worse, in this scenario you may also find yourself building an enormous object graph, so your application may run out of memory before it even reaches the inserts.
Here’s a simple workaround. Take a copy of the part of your object graph that’s going to cause the problem and then null it out. Perform the rest of the database insert using EF 6 and grab whatever identities you need for the remaining objects’ foreign keys. Then perform a SQL bulk insert of the remaining data.
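The first two steps might look something like this sketch (the `Wrap`, `Year` and `Percent` names are assumptions, and the tuple syntax needs C# 7 or later):

```csharp
// Detach the expensive branch of the graph before handing the rest to EF.
var detached = new List<(Year year, List<Percent> percents)>();
foreach (var year in wrap.Years)
{
    detached.Add((year, year.Percents.ToList()));  // keep a copy for the bulk insert
    year.Percents = null;                          // stop EF walking this branch
}

context.Wraps.Add(wrap);
context.SaveChanges();  // EF populates the generated identity values

// Wire up the foreign keys EF just generated, ready for the bulk insert.
foreach (var (year, percents) in detached)
    foreach (var p in percents)
        p.YearId = year.Id;
```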
Here’s a cut-down entity relationship diagram for an example object hierarchy, in this case an investment fund that is a wrap product for further funds (a fund of funds). In the wrap there are multiple years in the forecast, and in each year the percentage of the total sum invested varies across the funds. A single configuration contains multiples of these wraps. What’s important here is that the hierarchy can grow to hold a great many percentages in the final table, and thus require a great many inserts.
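One plausible shape for these entities, assuming hypothetical class and property names (not taken from the original code):

```csharp
public class Wrap
{
    public int Id { get; set; }
    public ICollection<Fund> Funds { get; set; }
    public ICollection<Year> Years { get; set; }
}

public class Fund
{
    public int Id { get; set; }
    public int WrapId { get; set; }
    public string Name { get; set; }
}

public class Year
{
    public int Id { get; set; }
    public int WrapId { get; set; }
    public ICollection<Percent> Percents { get; set; }  // the table that explodes
}

public class Percent
{
    public int Id { get; set; }
    public int YearId { get; set; }
    public int FundId { get; set; }
    public decimal Value { get; set; }  // share of the total invested in this fund this year
}
```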
The solution I’ve used relies on getting the DbContext from EF and using the underlying database connection to create commands (plain old SQL) for the intermediary tables (Fund and Year). It then creates a DataTable for the final table, Percent, and uses SqlBulkCopy with the context’s database connection and the underlying transaction (nice). The whole thing is wrapped in a try/catch block inside the context’s database transaction, so any exception can trigger a rollback.
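A sketch of that core pattern, assuming a hypothetical `FundsContext` and illustrative table and column names:

```csharp
using (var context = new FundsContext())
{
    context.Database.Connection.Open();
    using (var dbTran = context.Database.BeginTransaction())
    {
        // EF6 exposes the ADO.NET objects underneath, so plain SQL and
        // SqlBulkCopy can share EF's connection and transaction.
        var conn = (SqlConnection)context.Database.Connection;
        var sqlTran = (SqlTransaction)dbTran.UnderlyingTransaction;
        try
        {
            int fundId;
            // Plain SQL for an intermediary table, grabbing the identity.
            using (var cmd = conn.CreateCommand())
            {
                cmd.Transaction = sqlTran;
                cmd.CommandText =
                    "INSERT INTO dbo.Fund (Name) VALUES (@name); " +
                    "SELECT CAST(SCOPE_IDENTITY() AS int);";
                cmd.Parameters.AddWithValue("@name", "Example fund");
                fundId = (int)cmd.ExecuteScalar();  // FK for the rows below
            }

            // Stage the Percent rows in a DataTable...
            var table = new DataTable();
            table.Columns.Add("YearId", typeof(int));
            table.Columns.Add("FundId", typeof(int));
            table.Columns.Add("Value", typeof(decimal));
            // ... table.Rows.Add(yearId, fundId, value) for each row ...

            // ...and push them in one shot inside the same transaction.
            using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, sqlTran))
            {
                bulk.DestinationTableName = "dbo.Percent";
                bulk.WriteToServer(table);
            }

            dbTran.Commit();
        }
        catch
        {
            dbTran.Rollback();
            throw;
        }
    }
}
```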
You can follow along with this quite simple workaround in the demo code. Note the use of AutoDetectChangesEnabled = false and ValidateOnSaveEnabled = false, both of which improve performance without causing problems here, since we are only adding new objects.
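Those two settings live on the context’s Configuration property; a minimal sketch, again assuming a hypothetical `FundsContext`:

```csharp
using (var context = new FundsContext())
{
    // Skip the per-object change-detection scan and per-save validation,
    // both of which get expensive as the tracked graph grows.
    context.Configuration.AutoDetectChangesEnabled = false;
    context.Configuration.ValidateOnSaveEnabled = false;

    // ... perform the inserts as above ...
}
```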