Whole-Batch transaction 2

  • Whole-Batch transaction 2

    Hi,
    The question has already been asked here, but there seems to be no definitive answer yet.

    In short, the question is whether it's possible to have one transaction covering the whole job or, at least, a whole step. It's clear that this would make some of the features unusable, but I'm sure it's still a realistic requirement.

    I have quite a simple case where items are read from a file and should be written to the DB. The problem is that performance requirements forced us to abandon JPA and even JdbcBatchItemWriter. The reason for the latter decision was that items may be either updated or inserted. Currently I'm testing a custom writer which generates SQL to check whether the items it received are already in the DB and then arranges them into ones to be inserted and ones to be updated. Once arranged, it uses jdbcTemplate.batchUpdate to run the queries. Having items grouped by update/insert may also help us use OracleBatch in the future (even faster).
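
    Roughly, the idea looks like this. This is only a minimal sketch: the Item type, the items(id, value) table, and the old List-based ItemWriter signature are all assumptions for illustration, not the actual code.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    import org.springframework.batch.item.ItemWriter;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class UpsertSplittingItemWriter implements ItemWriter<Item> {

        private final JdbcTemplate jdbcTemplate;

        public UpsertSplittingItemWriter(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        @Override
        public void write(List<? extends Item> items) throws Exception {
            if (items.isEmpty()) {
                return;
            }
            // One generated query to find which of the incoming ids already exist.
            List<Object> ids = new ArrayList<Object>();
            for (Item item : items) {
                ids.add(item.getId());
            }
            String placeholders = String.join(",", Collections.nCopies(ids.size(), "?"));
            Set<Long> existing = new HashSet<Long>(jdbcTemplate.queryForList(
                    "SELECT id FROM items WHERE id IN (" + placeholders + ")",
                    Long.class, ids.toArray()));

            // Arrange the chunk into items to update and items to insert.
            List<Object[]> inserts = new ArrayList<Object[]>();
            List<Object[]> updates = new ArrayList<Object[]>();
            for (Item item : items) {
                if (existing.contains(item.getId())) {
                    updates.add(new Object[] { item.getValue(), item.getId() });
                } else {
                    inserts.add(new Object[] { item.getId(), item.getValue() });
                }
            }

            // One JDBC batch per statement type; grouping by insert/update also
            // leaves the door open for vendor-specific batching later.
            if (!inserts.isEmpty()) {
                jdbcTemplate.batchUpdate("INSERT INTO items (id, value) VALUES (?, ?)", inserts);
            }
            if (!updates.isEmpty()) {
                jdbcTemplate.batchUpdate("UPDATE items SET value = ? WHERE id = ?", updates);
            }
        }
    }

    // Hypothetical item type, just for the sketch.
    class Item {
        private final long id;
        private final String value;
        Item(long id, String value) { this.id = id; this.value = value; }
        long getId() { return id; }
        String getValue() { return value; }
    }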

    The writer seems to be working OK, but in some cases I really do need to commit only after the whole job is done, and, of course, I can't set commit-interval to Integer.MAX_VALUE.

    I'm thinking about creating a non-transactional job and committing the transaction in a dedicated tasklet. But would I still be able to tweak the number of items the writer receives after that?

    One solution, although quite a perverse one, seems to be to create the jdbcTemplate before transaction synchronization has started, for example in the writer's afterPropertiesSet method. That way it would be outside the step's transaction, but the number of items would still be configurable via commit-interval.
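
    As a sketch of that idea (all names hypothetical, and note it's not quite the same technique: a JdbcTemplate fetches a connection per operation rather than at creation time, so creating it early would not by itself detach it from the step's transaction), the closest working variant I can think of is a writer holding one raw Connection with auto-commit off, obtained directly from the DataSource, committed only when the job ends. That bypasses Spring's transaction management entirely, so rollback, skip, and restart no longer apply to these writes. It reuses the hypothetical Item type from the sketch above.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.List;

    import javax.sql.DataSource;

    import org.springframework.batch.item.ItemWriter;
    import org.springframework.beans.factory.DisposableBean;
    import org.springframework.beans.factory.InitializingBean;

    public class SingleConnectionItemWriter implements ItemWriter<Item>, InitializingBean, DisposableBean {

        private final DataSource dataSource;
        private Connection connection;

        public SingleConnectionItemWriter(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        @Override
        public void afterPropertiesSet() throws Exception {
            // Taken straight from the DataSource, not via DataSourceUtils,
            // so it never participates in the step's transaction.
            connection = dataSource.getConnection();
            connection.setAutoCommit(false);
        }

        @Override
        public void write(List<? extends Item> items) throws Exception {
            // Chunk size is still driven by commit-interval, but nothing is
            // committed here: the work just accumulates on this connection.
            PreparedStatement ps = connection.prepareStatement(
                    "INSERT INTO items (id, value) VALUES (?, ?)");
            try {
                for (Item item : items) {
                    ps.setLong(1, item.getId());
                    ps.setString(2, item.getValue());
                    ps.addBatch();
                }
                ps.executeBatch();
            } finally {
                ps.close();
            }
        }

        // To be called from an afterJob listener or a final tasklet:
        // commit on success, roll back on failure.
        public void commitAll() throws Exception {
            connection.commit();
        }

        public void rollbackAll() throws Exception {
            connection.rollback();
        }

        @Override
        public void destroy() throws Exception {
            connection.close();
        }
    }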

    What do you think?

  • #2
    I think I have a definitive answer: it's not possible.

    I don't understand your use case or the workarounds. Why do you need a single transaction? How do you know it is not going to run out of time and/or memory?

    The usual best practice is to use temp tables and copy the data to its final destination only when complete. Even that can time out, but it is usually more reliable than trying to batch individual inserts and updates into a single transaction.
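
    For illustration, the promotion could be a final step with a tasklet that moves everything over in one statement, so readers of the live table never see a half-finished import. The items_staging/items table names and the Oracle-style MERGE are assumptions, not a prescription:

    import org.springframework.batch.core.StepContribution;
    import org.springframework.batch.core.scope.context.ChunkContext;
    import org.springframework.batch.core.step.tasklet.Tasklet;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class PromoteStagingTasklet implements Tasklet {

        private final JdbcTemplate jdbcTemplate;

        public PromoteStagingTasklet(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        @Override
        public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
            // Upsert everything from staging into the live table. A tasklet
            // step runs in a single transaction by default, so the MERGE and
            // the cleanup commit (or roll back) together.
            jdbcTemplate.update(
                "MERGE INTO items t " +
                "USING items_staging s ON (t.id = s.id) " +
                "WHEN MATCHED THEN UPDATE SET t.value = s.value " +
                "WHEN NOT MATCHED THEN INSERT (t.id, t.value) VALUES (s.id, s.value)");
            jdbcTemplate.update("DELETE FROM items_staging");
            return RepeatStatus.FINISHED;
        }
    }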

    • #3
      Yes, a temp table is an option, though not without side effects.
      Regarding the cause of the necessity for a single transaction: in my case it comes from having a legacy system. I have an old import solution that used to push everything in in one transaction. The target system may contain some logic which uses the most recent data from a table, and if that's the case, such a table can't be updated partially.
      In general, I believe a single-transaction import would be quite a useful feature. Whenever there is a data integrity requirement and the code can't be changed to cull partially imported data, it would be helpful.

      Best regards, Eugene.
