NOTE: use Perl; is on undef hiatus. You can read content, but you can't post it. More info will be forthcoming forthcomingly.

All the Perl that's Practical to Extract and Report

  • So what do I do?

    Er... can't you split your 10000 records into, say, 10 chunks of 1000 records each, and insert them separately?

    I've seen people hit the very same problem and solve it that way...

    Or am I missing something?
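    A minimal sketch of that chunking approach with DBI. The table, column names, and the `@records` array are made up for illustration, and `$dbh` is assumed to be an already-connected handle with `AutoCommit` turned off:

    ```perl
    use strict;
    use warnings;

    # Assumes: $dbh is a live DBI handle (AutoCommit => 0),
    # and @records holds the full set of rows to load as arrayrefs.
    my $chunk_size = 1_000;

    my $sth = $dbh->prepare('INSERT INTO records (id, payload) VALUES (?, ?)');

    while (my @chunk = splice @records, 0, $chunk_size) {
        $sth->execute(@$_) for @chunk;
        $dbh->commit;    # one transaction per chunk, not one per file
    }
    ```

    Each `commit` releases the locks and log space held by that chunk, which is usually what a DBA complaining about a huge transaction wants to see.
    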
    PLAN A - Do what is natural in the business logic. Many a bulk file load has no transactional nature at the business-logic level -- any one record could be cancelled separately. And sometimes the whole file is a transaction to the business, too. For some of my feeds we are required to reject the whole file if any record rejects; for others we are required to accept the remaining records. I would claim the natural "transaction scope" for loading those files is different.

    However, business logic / natural

    --
    Bill
    # I had a sig when sigs were cool
    use Sig;
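    The two transaction scopes described above can be sketched like this in DBI. This is a hedged illustration, not the poster's code: `insert_record()` is a hypothetical helper, and `$dbh` is assumed to be a handle with `AutoCommit` off:

    ```perl
    use strict;
    use warnings;

    # Whole-file scope: reject the entire file if any record rejects.
    sub load_file_atomic {
        my ($dbh, @records) = @_;
        eval {
            insert_record($dbh, $_) for @records;   # insert_record() is hypothetical
            $dbh->commit;
            1;
        } or do {
            $dbh->rollback;    # one bad record cancels the whole file
            die "file rejected: $@";
        };
    }

    # Per-record scope: keep the good records, log the bad ones.
    sub load_file_lenient {
        my ($dbh, @records) = @_;
        for my $rec (@records) {
            eval { insert_record($dbh, $rec); $dbh->commit; 1 }
                or do { $dbh->rollback; warn "record rejected: $@" };
        }
    }
    ```

    Which loader is "correct" depends on the feed's business rules, not on the database.
    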
    Let's see: you have a transaction of a mere 10k rows, and the DBA comes complaining to you? Sounds to me like the DBA has a problem, not you. It should be you going to him, saying that you are trying to insert 10k rows and the database is really slow, and asking whether his database is badly configured.
    Thanks. I think, in addition to the plans n1vux offered, I needed that validation to know I'm not being completely unreasonable.

      I did some checking after my post, and some of the files I'm loading run to 40K records. But even that's not a difference of an order of magnitude.

      --
      J. David works really hard, has a passion for writing good software, and knows many of the world's best Perl programmers