I have encountered an issue that is essentially a duplicate of this post: .NET Oracle Client Slow to Read Large VARCHAR2, which unfortunately was left unanswered.
To simplify the problem, I made a table with a few columns, including one with varchar(4000) as the data type, and populated it with 6000 rows. Using Entity Framework Core, it took almost 10 seconds to return the result set when executing context.EntityNameHere.ToList().
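For reference, here is a minimal sketch of the setup and the slow call; the entity and property names (TestEntity, LargeText, etc.) are placeholders rather than my real schema:

```
// Placeholder entity mapped to the test table; LargeText stands in for the varchar(4000) column.
public class TestEntity
{
    public int Id { get; set; }
    public string LargeText { get; set; }  // stored as VARCHAR(4000) in the test table
}

// Timing the call that is slow for me (~6000 rows):
var sw = System.Diagnostics.Stopwatch.StartNew();
var rows = context.TestEntities.ToList();  // takes close to 10 seconds
sw.Stop();
Console.WriteLine($"ToList took {sw.ElapsedMilliseconds} ms for {rows.Count} rows");
```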
At first, I suspected database or response time issues, but in the EF Core logs I can see the executed DbCommand only takes 80-120 ms. The generated SQL is also very simple: just a SELECT of the column names from the table, with no WHERE clause or other complexity. If I reduce that varchar(4000) down to something much smaller, say 40, the results come back much faster. However, that was only a test; I need the larger varchar size.
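For context, the DbCommand timing comes from standard EF Core logging, configured roughly like this (a sketch, assuming the Oracle.EntityFrameworkCore provider; the actual connection string and log level may differ):

```
// In the DbContext - LogTo writes the "Executed DbCommand (XX ms)" lines
// where I see the 80-120 ms figure.
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder
        .UseOracle("<connection string here>")
        .LogTo(Console.WriteLine, Microsoft.Extensions.Logging.LogLevel.Information);
}
```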
Running the same SELECT statement in SQL Developer or DBeaver yields much faster results, roughly 10x faster. I am fairly sure it has something to do with the large varchar column, but I can't figure out how to fix it.
I have tried changing the column to varchar2, explicitly specifying the entity property type as varchar(4000), using AsNoTracking, and dropping down to the underlying DB connection to execute a reader and loop through it the old-fashioned way (roughly as sketched below). None of these yielded a performance improvement.
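For completeness, the AsNoTracking and raw-reader attempts looked roughly like this (a sketch; table and column names are placeholders):

```
// Attempt: skip change tracking - no noticeable difference.
var untracked = context.TestEntities.AsNoTracking().ToList();

// Attempt: bypass EF materialization entirely and read via the underlying connection
// (GetDbConnection() comes from Microsoft.EntityFrameworkCore) - still slow.
var conn = context.Database.GetDbConnection();
if (conn.State != System.Data.ConnectionState.Open)
    conn.Open();

using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = "SELECT ID, LARGE_TEXT FROM TEST_TABLE";  // placeholder names
    using (var reader = cmd.ExecuteReader())
    {
        var values = new List<string>();
        while (reader.Read())
        {
            values.Add(reader.GetString(1));  // the varchar(4000) column
        }
    }
}
```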