  • problem with accumulated values from previous execution

    Hi,
    I have a job with 2 steps. The first one reads from the input, validates, and writes to a table called PERSONA.
    The second step does a SELECT on the PERSONA table to obtain the number of records inserted in step one and the sum of another column as a total check. The result is written to a summary table called RESUMEN_PERSONA.
    The whole process is the same as the football example; only the tables and objects changed.

    The problem is that the first time I execute the job it reports that 5 records were inserted into the PERSONA table. I manually delete these 5 records and try again. The next time it reports 10, the next time 15, etc. In other words, it is accumulating the totals from the previous executions. How do I clear the previous execution values? I am deleting all records manually from both tables after each execution. In the football example the tables are dropped and recreated on each execution.

    The step is this one:

    <bean id="step2" parent="simpleStep">
        <property name="commitInterval" value="1" />
        <property name="itemReader" ref="resumenPersonasSource" />
        <property name="itemWriter">
            <bean class="sbif.mrdsk.dao.JdbcResumenPersonaDao">
                <property name="dataSource" ref="dataSource" />
            </bean>
        </property>
    </bean>

    <bean id="resumenPersonasSource"
        class="org.springframework.batch.io.cursor.JdbcCursorItemReader">
        <property name="dataSource" ref="dataSource" />
        <property name="mapper">
            <bean class="sbif.mrdsk.mapping.ResumenPersonaMapper" />
        </property>
        <property name="sql">
            <value>
                SELECT count(id), sum(edad) FROM persona
            </value>
        </property>
    </bean>

    And this is the DAO:

    public class JdbcResumenPersonaDao extends JdbcDaoSupport implements ItemWriter {

        private static final String INSERT_RESUMEN = "INSERT INTO RESUMEN_PERSONA(ID, CUENTA, SUMA_EDAD, FECHA) VALUES(RESUMEN_PERSONA_SEQ.NEXTVAL, ?, ?, Current_Timestamp)";

        public void write(Object output) {

            // only ResumenPersona instances can be written by this DAO
            Assert.isInstanceOf(ResumenPersona.class, output,
                    JdbcResumenPersonaDao.class.getName() + " only "
                            + "supports writing instances of "
                            + ResumenPersona.class.getName());

            ResumenPersona resumen = (ResumenPersona) output;

            Object[] args = new Object[] { Integer.valueOf(resumen.getCuenta()),
                    Integer.valueOf(resumen.getSumaEdad()) };

            // insert one summary row per item
            getJdbcTemplate().update(INSERT_RESUMEN, args);
        }

        public void close() throws Exception {
        }

        public void clear() throws ClearFailedException {
        }

        public void flush() throws FlushFailedException {
        }

    }

  • #2
    How are you deleting the existing rows? Are you sure you committed that transaction?

    (Please use [code][/code] tags to post code and stack traces.)
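    Just to illustrate the point: a manual cleanup from an Oracle console session only becomes visible to other sessions once it has been committed, so a minimal sketch of the cleanup in question (table names taken from this thread; the COMMIT is the important part) would be:

    -- hypothetical manual cleanup between test runs
    DELETE FROM RESUMEN_PERSONA;
    DELETE FROM PERSONA;
    COMMIT;  -- without this, other sessions (including the batch job) still see the old rows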

    • #3
      There are multiple ways of handling this. One is the tried and true 'schedule date' method: in this example you would run the job once a day, for a particular schedule date, and the records you insert into the PERSONA table would carry a schedule-date column, something like 'INSERTION_DT'. You could then filter your summarization query on this column. Another method is putting some type of status column in your table and flipping it once you've summarized the row, something like 'STATUS' that is 'NEW' when you first insert it and 'EXISTING' afterwards. Either way, it has to be handled in the data itself, unless you can remove the rows as Dave suggested, which you might not be able to do if the data is needed elsewhere; otherwise there's no way to know what is new and what isn't.
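      To sketch what that could look like (purely illustrative; INSERTION_DT and STATUS are hypothetical columns that would have to be added to PERSONA):

      -- schedule-date method: summarize only the rows for the current schedule date
      SELECT COUNT(id), SUM(edad)
        FROM persona
       WHERE insertion_dt = ?;  -- schedule date passed in as a parameter

      -- status-column method: summarize only the rows not yet processed, then flip them
      SELECT COUNT(id), SUM(edad)
        FROM persona
       WHERE status = 'NEW';

      UPDATE persona
         SET status = 'EXISTING'
       WHERE status = 'NEW';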

      • #4
        Originally posted by Dave Syer
        How are you deleting the existing rows? Are you sure you committed that transaction?

        (Please use [code][/code] tags to post code and stack traces.)
        I am just doing some tests, so I was deleting the records manually in the database (Oracle) console after the batch processing finished. Then I ran the process again: again only 5 records were validated and inserted into PERSONA (this is correct), but the totals selected in step 2 are being added to the totals of the previous execution. For example, the total number of records was 5, now it is 10; the sum of the other column was 145, now it is 290. It's like something is not being cleared after each execution.

        • #5
          Originally posted by lucasward
          There are multiple ways of handling this. One is the tried and true 'schedule date' method: in this example you would run the job once a day, for a particular schedule date, and the records you insert into the PERSONA table would carry a schedule-date column, something like 'INSERTION_DT'. You could then filter your summarization query on this column. Another method is putting some type of status column in your table and flipping it once you've summarized the row, something like 'STATUS' that is 'NEW' when you first insert it and 'EXISTING' afterwards. Either way, it has to be handled in the data itself, unless you can remove the rows as Dave suggested, which you might not be able to do if the data is needed elsewhere; otherwise there's no way to know what is new and what isn't.
          I am deleting all the records from both tables (PERSONA and RESUMEN_PERSONA) after each execution. It's just for testing purposes, so I expect each execution of the batch process to behave as if it were the first one. I am deleting the records directly in the database console.

          • #6
            I don't see how this could be something Spring Batch is doing, since it's just pulling from the table. Are you sure you're committing your delete? Is the data really gone in the console?
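            Purely as an illustration, one way to check is to query from a fresh console session after the delete:

            -- run from a new session after the DELETE and COMMIT, before re-running the job
            SELECT COUNT(*) FROM persona;
            SELECT COUNT(*) FROM resumen_persona;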
