Hi all,
I have read on here, and in several other articles online, that having a BEFORE INSERT trigger on a table to fetch sequence.nextval is a performance hit compared to just putting the sequence reference into the INSERT statement in the app code. Most of these articles are about 10 years old or older, and I was wondering if that rule still applies today or if it is a myth. The reason I ask is that we have a pretty archaic application with a very complex code base, and the people who wrote it are no longer here. We are trying to perform some tasks, and it was brought up that we need to add an ID column to a table "events" which every customer schema has. The table varies per schema from about 500K rows to upwards of 500 million rows. Some of the DBAs are against the idea of the trigger; however, a few are not against it and claim that's the old mentality and way of thinking.
If we set up a trigger where all it does is get the next sequence value in a BEFORE INSERT and assign it to the ID column, and this trigger fires thousands and thousands of times a day for each customer, would there really be a negative hit on the DB?
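For reference, here is roughly what the two approaches being debated look like. This is only a sketch; the sequence name (events_seq) and the extra column (event_date) are placeholders, not our actual objects:

```sql
-- Option 1: BEFORE INSERT trigger populates the ID.
-- On 11g, a sequence can be assigned directly in PL/SQL
-- (no SELECT ... INTO ... FROM dual needed).
CREATE OR REPLACE TRIGGER events_bi
BEFORE INSERT ON events
FOR EACH ROW
WHEN (NEW.id IS NULL)
BEGIN
  :NEW.id := events_seq.NEXTVAL;
END;
/

-- Option 2: the app code references the sequence in the INSERT itself.
INSERT INTO events (id, event_date)
VALUES (events_seq.NEXTVAL, SYSDATE);
```

The WHEN (NEW.id IS NULL) clause means the trigger only fires its body when the app did not supply an ID, so both styles can coexist during a transition.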
I plan to set up a test environment where I can measure the effect, but I wanted to ask here first.
We are running Oracle 11.2.0.4, 64-bit, on Solaris.