I have noticed that with a medium-to-large result set, a query that selects a wide VARCHAR2 column (e.g. VARCHAR2(4000)) runs much more slowly through the .NET provider than through TOAD or SQL Developer. It doesn't matter whether the column contains only null values, and it doesn't matter whether I actually read the column values.
For example:
// LINQPad script: .Dump() comes from LINQPad, which also allows top-level await.
var connectionString = "Your connection string here;";
var sql = @"
SELECT cast(null as varchar2(4000))
FROM dual
CONNECT BY rownum <= 10000";

using (var con = new Oracle.ManagedDataAccess.Client.OracleConnection())
{
    con.ConnectionString = connectionString;
    con.Open();
    var start = DateTime.Now;
    var i = 0;
    using (var cmd = con.CreateCommand())
    {
        cmd.CommandText = sql;
        using (var reader = await cmd.ExecuteReaderAsync())
        {
            while (await reader.ReadAsync())
                i++;
        }
    }
    var end = DateTime.Now;
    var time = (end - start).TotalSeconds;
    ("Found " + i + " rows and took " + time + " seconds").Dump();
}
On my database, this returned the following:
Found 10000 rows and took 9.0968015 seconds
However, when I wrap the column in to_clob like this:
var sql = @"
SELECT to_clob(cast(null as varchar2(4000)))
FROM dual
CONNECT BY rownum <= 10000";
Found 10000 rows and took 0.3094698 seconds
That is roughly 30 times faster with to_clob!
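One variable I haven't fully ruled out is the provider's fetch buffer. My understanding (which may be wrong) is that ODP.NET sizes its per-round-trip buffer from each column's declared maximum width, so a VARCHAR2(4000) column reserves the full 4000 characters per row regardless of the actual data, and the default FetchSize of 128 KB then holds only a few rows per round trip. FetchSize is a real OracleCommand property (in bytes); the 64x multiplier below is just an arbitrary value for the experiment:

```csharp
// Experiment: enlarge the fetch buffer so more rows fit per round trip.
// FetchSize is in bytes; the managed provider's default is 131072 (128 KB).
using (var cmd = con.CreateCommand())
{
    cmd.CommandText = sql;
    cmd.FetchSize = cmd.FetchSize * 64; // ~8 MB buffer instead of 128 KB
    using (var reader = await cmd.ExecuteReaderAsync())
    {
        while (await reader.ReadAsync())
            i++; // count rows, never materializing the column values
    }
}
```

If the slowdown is round-trip dominated, this should narrow the gap between the VARCHAR2 and to_clob versions; if it doesn't, the cause is presumably elsewhere.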
Does anyone know why this is? Is this a bug in the .NET providers? Am I missing something here?