A bulk operation refers to the process of inserting, updating, or deleting a large number of records in a database in a single, optimized command or batch, rather than executing individual statements for each row.

This is critically important for performance in data-intensive applications (e.g. ETL processes, migrations, or real-time analytics) because it reduces network round-trips, minimizes transaction-log overhead, and can improve throughput by 10x to 100x compared to row-by-row operations.

 

So now that we understand why this matters, let's look at EF Core's support for bulk operations and see whether methods like AddRange() and RemoveRange() count as true bulk operations.

 

What EF Does Provide (Not True Bulk)

context.Entities.AddRange(entities);     // Marks entities as Added (no SQL sent yet)
context.Entities.RemoveRange(entities);  // Marks entities as Deleted (no SQL sent yet)
context.SaveChanges();                   // Generates one INSERT/DELETE statement per entity
  • Problem: SaveChanges generates one SQL statement per entity.
    • ✔ AddRange/RemoveRange only batch change-tracker entries (EF Core groups the statements into batched round-trips, but they remain per-row statements).
    • ❌ They DO NOT produce a single set-based SQL command.
  • Performance: poor for 1,000+ records because of the per-entity statements.
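
You can watch this happen by turning on EF Core's statement logging. A minimal sketch (AppDbContext and the connection string are placeholders):

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;

var options = new DbContextOptionsBuilder<AppDbContext>()
    .UseSqlServer("connectionString")
    .LogTo(Console.WriteLine, LogLevel.Information) // prints every command EF sends
    .Options;

using var context = new AppDbContext(options);
context.Entities.AddRange(entities);
context.SaveChanges(); // the log shows one INSERT per entity, grouped into batches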

 

High-Efficiency Bulk Data Techniques for EF Core

EF Core: ExecuteDeleteAsync, ExecuteUpdateAsync (EF 7+)

True bulk DELETE and UPDATE — no loading entities!

// Bulk DELETE
await context.Products.Where(p => p.Price < 10).ExecuteDeleteAsync();

// Bulk UPDATE
await context.Products.Where(p => p.IsActive)
                      .ExecuteUpdateAsync(s => s.SetProperty(p => p.Price, p => p.Price * 1.1m));
  • Available in EF Core 7.0+
  • Generates a single set-based SQL command
  • No change tracking, one round-trip instead of N
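
Both methods also return the number of affected rows, which is handy for logging; a minimal sketch:

int deleted = await context.Products
    .Where(p => p.Price < 10)
    .ExecuteDeleteAsync(); // single DELETE statement; the count comes back from the server
Console.WriteLine($"{deleted} products removed.");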

 

Limitations:

  • ❌ Cannot set navigation properties (only scalar columns).
  • ❌ Cannot use expressions the provider can't translate to SQL.
  • ❌ Cannot modify multiple tables in a single call.
  • ❌ ExecuteUpdate can only update columns of the queried entity (not related entities).
  • ❌ No optimistic-concurrency checks (RowVersion is ignored).
  • ❌ EF-level logic (interceptors, SaveChanges events) is bypassed; database-side triggers still fire, as for any plain SQL.
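
Within those limits you can still update several scalar columns in one statement by chaining SetProperty. A minimal sketch (UpdatedAt is an assumed column, not part of the earlier example):

await context.Products
    .Where(p => p.IsActive)
    .ExecuteUpdateAsync(s => s
        .SetProperty(p => p.Price, p => p.Price * 1.1m)    // computed from the current value
        .SetProperty(p => p.UpdatedAt, DateTime.UtcNow));  // constant for all matched rows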

 

Third-Party Libraries (Most Popular)

EFCore.BulkExtensions

  • Insert, Update, Upsert, Delete    
  • Best performance, actively maintained
using EFCore.BulkExtensions;

var entities = new List<Product>(10000);
for (int i = 0; i < 10000; i++)
    entities.Add(new Product { Name = "Item " + i, Price = i });

await context.BulkInsertAsync(entities);
await context.BulkUpdateAsync(updatedEntities);
await context.BulkInsertOrUpdateAsync(upsertEntities); // Merge


// Delete
context.Items.Where(a => a.ItemId > 500).BatchDelete();
context.Items.Where(a => a.ItemId > 500).BatchDeleteAsync();

// Update (via lambda expression); supports Increment/Decrement
context.Items.Where(a => a.ItemId <= 500).BatchUpdate(a => new Item { Quantity = a.Quantity + 100 });
context.Items.Where(a => a.ItemId <= 500).BatchUpdateAsync(a => new Item { Quantity = a.Quantity + 100 });
// the increment can be a literal ('+ 100') or a variable ('+ incrementStep', e.g. int incrementStep = 100;)

// Update (via simple object)
context.Items.Where(a => a.ItemId <= 500).BatchUpdate(new Item { Description = "Updated" });
context.Items.Where(a => a.ItemId <= 500).BatchUpdateAsync(new Item { Description = "Updated" });
// Update (via simple object) - requires an extra argument to reset a property to its default value
var updateCols = new List<string> { nameof(Item.Quantity) }; // update 'Quantity' to its default value: 0
var q = context.Items.Where(a => a.ItemId <= 500);
int affected = q.BatchUpdate(new Item { Description = "Updated" }, updateCols); // returns the number of affected rows

// Batch iteration (useful in some cases to avoid lock escalation)
int chunkSize = 1000, rowsAffected;
do
{
    rowsAffected = query.Take(chunkSize).BatchDelete();
} while (rowsAffected >= chunkSize);

// Truncate
context.Truncate<Entity>();
context.TruncateAsync<Entity>();
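
Most of these calls also accept a BulkConfig for tuning. A minimal sketch using a few of the library's documented options:

var bulkConfig = new BulkConfig
{
    BatchSize = 4000,            // rows per internal batch
    SetOutputIdentity = true,    // write server-generated IDs back into the entities
    PreserveInsertOrder = true   // keep list order so IDs map to the right objects
};
await context.BulkInsertAsync(entities, bulkConfig);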

 

Z.EntityFramework.Extensions

  • All bulk ops    
  • Paid license for production
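
A minimal sketch of its API, using method names from the vendor's documentation:

// Drop-in, batched replacement for SaveChanges:
context.BulkSaveChanges();

// Direct bulk insert; IncludeGraph also inserts related entities:
context.BulkInsert(orders, options => options.IncludeGraph = true);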

 

SqlBulkCopy (maximum performance, inserts only)

using Microsoft.Data.SqlClient; // System.Data.SqlClient on older stacks

using (SqlConnection connection = new("connectionString"))
{
    connection.Open();
    using var transaction = connection.BeginTransaction();
    using SqlBulkCopy sqlBulkCopy = new(connection, SqlBulkCopyOptions.Default, transaction)
    {
        BatchSize = 1000,        // optional: rows per internal batch
        BulkCopyTimeout = 600,   // optional: timeout in seconds
    };
    try
    {
        DataTable dt = new DataTable();
        dt.Columns.Add("Id", typeof(int));
        dt.Columns.Add("Name", typeof(string));

        for (int i = 0; i < 300000; i++)
        {
            DataRow dr = dt.NewRow();
            dr["Id"] = i + 1;
            dr["Name"] = "Name";
            dt.Rows.Add(dr);
        }

        sqlBulkCopy.DestinationTableName = "[dbo].[People]";
        sqlBulkCopy.WriteToServer(dt);

        transaction.Commit();
    }
    catch
    {
        transaction.Rollback();
        throw; // rethrow so the caller knows the load failed
    }
}

SqlBulkCopy Only Makes Sense When:

  • Inserting 1,000+ records (below that, EF or BulkExtensions is simpler)
  • Performance is critical
  • You're okay with bypassing EF change tracking
  • You don't need identity values back (or you handle them manually)
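
If your source rows already carry their primary-key values, SqlBulkCopy can keep them instead of letting the server assign new ones. A minimal sketch:

// SqlBulkCopyOptions.KeepIdentity inserts the source identity values as-is
using var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.KeepIdentity, transaction)
{
    DestinationTableName = "[dbo].[People]"
};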

1) What is the FASTEST way to convert a List<T> to a DataTable?

Best & Fastest: Use IDataReader (No DataTable needed!)

Avoid DataTable entirely; it's slow to build and duplicates your data in memory. Use a custom IDataReader that streams your list directly.

using System.Data;
using System.Linq.Expressions;
using System.Reflection;


public class ObjectReader<T> : IDataReader
{
    private readonly IEnumerator<T> _enumerator;
    private readonly IList<PropertyInfo> _properties;
    private readonly IList<Func<T, object>> _getters;
    private readonly Dictionary<string, int> _ordinalLookup;
    private bool _disposed;

    public ObjectReader(IEnumerable<T> data)
    {       
        _enumerator = data.GetEnumerator();

        _properties = typeof(T)
            .GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Where(p => p.CanRead)
            .ToList();

        _getters = _properties
            .Select(CreateGetter)
            .ToList();

        _ordinalLookup = _properties
            .Select((p, i) => new { p.Name, Index = i })
            .ToDictionary(x => x.Name, x => x.Index, StringComparer.OrdinalIgnoreCase);
    }
    // Optimized Getter (Compiled Expression)
    private static Func<T, object> CreateGetter(PropertyInfo prop)
    {
        var instance = Expression.Parameter(typeof(T), "x");
        var property = Expression.Property(instance, prop);

        // Convert value types to object
        var convert = Expression.Convert(property, typeof(object));

        return Expression.Lambda<Func<T, object>>(convert, instance).Compile();
    }

    // ----------------------------------------
    // IDataReader Implementation
    // ----------------------------------------

    public bool Read() => _enumerator.MoveNext();

    public int FieldCount => _properties.Count;

    public string GetName(int i) => _properties[i].Name;

    public int GetOrdinal(string name) => _ordinalLookup[name];

    public object GetValue(int i) => _getters[i](_enumerator.Current);

    public object this[int i] => GetValue(i);

    public object this[string name] => GetValue(GetOrdinal(name));

    public bool IsDBNull(int i)
    {
        var val = GetValue(i);
        return val == null || val == DBNull.Value;
    }

    public Type GetFieldType(int i) => _properties[i].PropertyType;

    public string GetDataTypeName(int i) => GetFieldType(i).Name;

    public int GetValues(object[] values)
    {
        int count = Math.Min(values.Length, FieldCount);
        for (int i = 0; i < count; i++)
            values[i] = GetValue(i);
        return count;
    }

    // ----------------------------------------
    // Strongly-Typed Getters
    // ----------------------------------------
    public bool GetBoolean(int i) => (bool)GetValue(i);
    public byte GetByte(int i) => (byte)GetValue(i);
    public char GetChar(int i) => (char)GetValue(i);
    public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
    public decimal GetDecimal(int i) => (decimal)GetValue(i);
    public double GetDouble(int i) => (double)GetValue(i);
    public float GetFloat(int i) => (float)GetValue(i);
    public Guid GetGuid(int i) => (Guid)GetValue(i);
    public short GetInt16(int i) => (short)GetValue(i);
    public int GetInt32(int i) => (int)GetValue(i);
    public long GetInt64(int i) => (long)GetValue(i);
    public string GetString(int i) => (string)GetValue(i);

    // Not required by SqlBulkCopy
    public IDataReader GetData(int i) => throw new NotSupportedException();
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferoffset, int length) => throw new NotSupportedException();
    public long GetChars(int i, long fieldOffset, char[] buffer, int bufferoffset, int length) => throw new NotSupportedException();

    public DataTable GetSchemaTable() => null; // Simplest valid implementation

    public int Depth => 0;
    public bool IsClosed => false;
    public int RecordsAffected => -1;

    public bool NextResult() => false;
    public void Close() { }

    public void Dispose()
    {
        if (!_disposed)
        {
            _enumerator.Dispose();
            _disposed = true;
        }
    }
}

 

2) Usage: Stream 10,000 records in milliseconds

public async Task BulkInsertAsync<T>(List<T> entities, string tableName)
{
    using var reader = new ObjectReader<T>(entities);

    await using var connection = new SqlConnection(connectionString);
    await connection.OpenAsync();

    using var bulkCopy = new SqlBulkCopy(connection)
    {
        DestinationTableName = tableName,
        EnableStreaming = true
    };

    // Map columns by name (SqlBulkCopy maps by ordinal when no mappings are given,
    // so property order would otherwise have to match the table's column order):
    foreach (var prop in typeof(T).GetProperties())
        bulkCopy.ColumnMappings.Add(prop.Name, prop.Name);

    await bulkCopy.WriteToServerAsync(reader);
}
  • No DataTable → No memory bloat
  • Streams directly from your list
  • 10,000 records → 30–80ms total
  • Zero reflection per row after startup
  • No NuGet needed.
  • Works with any POCO.
  • Supports 10k–1M+ records.
  • Example call: await BulkInsertAsync(myListOf10KProducts, "Products");

 

3) WriteToServerAsync is not atomic by itself.

using var bulkCopy = new SqlBulkCopy(connection);
await bulkCopy.WriteToServerAsync(dataTable); // 10,000 rows
  1. Row 1–1,000: Inserted
  2. Row 1,001: Fails (e.g. unique constraint)
  3. Rows 1–1,000 stay in DB
  4. No automatic rollback

 

4) How to Make It Atomic? Wrap in a SqlTransaction

await using var connection = new SqlConnection(connStr);
await connection.OpenAsync();
using var transaction = connection.BeginTransaction();
try
{
    using var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction)
    {
        DestinationTableName = "Products"
    };

    await bulkCopy.WriteToServerAsync(dataTable);
    transaction.Commit(); // Only if ALL rows succeed
}
catch
{
    transaction.Rollback(); // All changes undone
    throw;
}
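
If all-or-nothing semantics are too strict for very large loads, SqlBulkCopy can instead commit per batch, so a failure only rolls back the batch in flight. A minimal sketch (note this overload takes a connection string, not an open connection):

using var bulkCopy = new SqlBulkCopy(connStr, SqlBulkCopyOptions.UseInternalTransaction)
{
    DestinationTableName = "Products",
    BatchSize = 1000 // each 1,000-row batch commits on its own
};
await bulkCopy.WriteToServerAsync(dataTable);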

 

 

How To Choose:

In short, the guidance above boils down to one rule of thumb:


Bulk Operations Decision Rule (One Glance)

Record Count | INSERT                        | UPDATE / DELETE
≤ 20         | EF AddRange() / SaveChanges() | EF ExecuteUpdate/Delete (or SaveChanges)
21 – 1,000   | EFCore.BulkExtensions         | EFCore.BulkExtensions
> 1,000      | SqlBulkCopy (fastest)         | EFCore.BulkExtensions

 

Why This Works

Range                   | Reason
≤ 20                    | EF is faster + simpler; no overhead
21 – 1,000              | BulkExtensions gives good speed + EF integration
> 1,000 (inserts)       | SqlBulkCopy is raw fastest (native protocol)
> 1,000 (update/delete) | BulkExtensions uses MERGE/DELETE efficiently; SqlBulkCopy can't help

 

One-Liner Code Logic

if (count <= 20)
{
    context.Items.AddRange(items);        // AddRange returns void, so it can't be chained
    await context.SaveChangesAsync();
}
else if (count <= 1000)
    await context.BulkInsertAsync(items); // or BulkUpdateAsync / BulkDeleteAsync
else
    await SqlBulkCopyInsert(items);       // custom helper (see BulkInsertAsync above)

 

Nested Objects

None of SqlBulkCopy, EFCore.BulkExtensions, or Z.EntityFramework.Extensions supports nested objects out of the box. They only work with flat POCOs (plain objects with simple properties: int, string, DateTime, etc.).

But Z.EntityFramework.Extensions has partial support via IncludeGraph, and in all three you can work around nesting with manual flattening or separate bulk inserts (see the sketch at the end of this section).

 

Scenario: we want to insert a List<Order>, where each Order has a Customer and a List<OrderItem>.

public class Order
{
    public int Id { get; set; }
    public Customer Customer { get; set; }        // nested object
    public List<OrderItem> Items { get; set; }    // nested collection
}

public class Customer { ... }
public class OrderItem { ... }

 

What Happens If You Try?

Library                      | Result
SqlBulkCopy                  | Fails; can't map Customer or Items to columns
EFCore.BulkExtensions        | Inserts only Order's own columns (Customer and Items are ignored)
Z.EntityFramework.Extensions | May work if Customer and OrderItem are EF navigation properties and you use IncludeGraph
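
For completeness, a minimal sketch of the "separate bulk inserts" workaround with EFCore.BulkExtensions. It assumes Order carries a CustomerId foreign key, OrderItem carries an OrderId foreign key, and both Customer and Order use identity Id columns (all assumptions, not shown in the model above):

// 1) Insert parents first and capture their generated IDs
var customers = orders.Select(o => o.Customer).Distinct().ToList();
await context.BulkInsertAsync(customers, new BulkConfig { SetOutputIdentity = true });

// 2) Copy the new keys onto the next level, then insert it
foreach (var order in orders)
    order.CustomerId = order.Customer.Id;
await context.BulkInsertAsync(orders, new BulkConfig { SetOutputIdentity = true });

// 3) Flatten the leaf collections and insert them last
var items = orders.SelectMany(o =>
{
    foreach (var item in o.Items)
        item.OrderId = o.Id;
    return o.Items;
}).ToList();
await context.BulkInsertAsync(items);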