  • "Normalizing" csv properties when reading data?

    I'm reading in a big, denormalized CSV file.

    It has fields like:

    First name, phone number 1, phone number 2, ..., mailing address line 1, mailing address line 2, ..., business address line 1, business address line 2, ..., etc.

    In my domain model, I want to have it somewhat more normalized so it looks like this:

    Foo {
        List<PhoneNumber> phoneNumbers;
        Address mailingAddress;
        Address businessAddress;
        ...
    }

    (I say "normalizing" because I'm reading these into Hibernate entities, and when they get saved out, some of the properties will be @Embedded in the same table as Foo, while some of the lists will be split off into another table.)
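
    For illustration, a minimal JPA mapping along those lines might look like the sketch below. The type and field names are illustrative assumptions, and the two @Embedded addresses would in practice need @AttributeOverrides to get distinct column names:

    import java.util.ArrayList;
    import java.util.List;
    import javax.persistence.ElementCollection;
    import javax.persistence.Embeddable;
    import javax.persistence.Embedded;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    @Embeddable
    class Address {
        String line1;
        String line2;
    }

    @Embeddable
    class PhoneNumber {
        String number;
    }

    @Entity
    class Foo {
        @Id
        @GeneratedValue
        Long id;

        String firstName;

        // Hibernate stores this list in a separate collection table
        @ElementCollection
        List<PhoneNumber> phoneNumbers = new ArrayList<PhoneNumber>();

        // Both addresses are flattened into Foo's own table
        @Embedded
        Address mailingAddress;

        @Embedded
        Address businessAddress;
    }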

    Is there an easy way to build a FieldSetMapper to handle this kind of thing? Has anyone come up with an elegant way to handle something like this yet? (A rough sketch of what I mean is below.)
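
    As a rough sketch, a custom FieldSetMapper could fold the repeated columns into the domain object like this. The column names are assumptions, and Foo/Address/PhoneNumber are the types sketched above:

    import org.springframework.batch.item.file.mapping.FieldSetMapper;
    import org.springframework.batch.item.file.transform.FieldSet;
    import org.springframework.validation.BindException;

    public class FooFieldSetMapper implements FieldSetMapper<Foo> {

        @Override
        public Foo mapFieldSet(FieldSet fs) throws BindException {
            Foo foo = new Foo();
            foo.firstName = fs.readString("firstName");

            // Collect the repeated phone columns into one list, skipping blanks
            for (String column : new String[] { "phone1", "phone2" }) {
                String number = fs.readString(column);
                if (number.length() > 0) {
                    PhoneNumber phone = new PhoneNumber();
                    phone.number = number;
                    foo.phoneNumbers.add(phone);
                }
            }

            // Fold the flat address columns into the embedded objects
            foo.mailingAddress = readAddress(fs, "mailAddr1", "mailAddr2");
            foo.businessAddress = readAddress(fs, "bizAddr1", "bizAddr2");
            return foo;
        }

        private Address readAddress(FieldSet fs, String line1, String line2) {
            Address address = new Address();
            address.line1 = fs.readString(line1);
            address.line2 = fs.readString(line2);
            return address;
        }
    }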

    To put another wrinkle in it: the denormalized CSV file I'm reading actually contains information on two different types of entities, and based on one of the fields it will spit out a different entity. I think I can just add a step to the batch that looks at that one field and then uses a conditional flow to go to the appropriate reading step (see the decider sketch below).
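
    If that conditional-flow route pans out, one sketch of it is a JobExecutionDecider that routes on a value an earlier "peek" step stored in the job's ExecutionContext. The "recordType" key and the status names are hypothetical:

    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.job.flow.FlowExecutionStatus;
    import org.springframework.batch.core.job.flow.JobExecutionDecider;

    public class EntityTypeDecider implements JobExecutionDecider {

        @Override
        public FlowExecutionStatus decide(JobExecution jobExecution,
                                          StepExecution stepExecution) {
            // An earlier step is assumed to have peeked at the discriminator
            // field and stored it under this (hypothetical) key
            String type = jobExecution.getExecutionContext().getString("recordType");
            return new FlowExecutionStatus(type); // e.g. "FOO" or "BAR"
        }
    }

    The job definition would then map each status to the matching reading step.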

  • #2
    What about reading them into a "flat object", then transforming them into the normalized object (the Hibernate entity) in the ItemProcessor, and letting Hibernate work its magic in the ItemWriter?
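
    A minimal sketch of that idea, assuming a hypothetical FlatRecord that mirrors the CSV columns one-to-one (a BeanWrapperFieldSetMapper could populate it straight from the file), and reusing the Foo types sketched above:

    import org.springframework.batch.item.ItemProcessor;

    // Hypothetical flat object mirroring the CSV columns one-to-one
    class FlatRecord {
        String firstName;
        String phone1;
        String phone2;
        String mailAddr1;
        String mailAddr2;
        String bizAddr1;
        String bizAddr2;
    }

    public class FlatRecordProcessor implements ItemProcessor<FlatRecord, Foo> {

        @Override
        public Foo process(FlatRecord flat) throws Exception {
            Foo foo = new Foo();
            foo.firstName = flat.firstName;

            // Gather the repeated phone columns into the normalized list
            for (String number : new String[] { flat.phone1, flat.phone2 }) {
                if (number != null && number.length() > 0) {
                    PhoneNumber phone = new PhoneNumber();
                    phone.number = number;
                    foo.phoneNumbers.add(phone);
                }
            }

            Address mail = new Address();
            mail.line1 = flat.mailAddr1;
            mail.line2 = flat.mailAddr2;
            foo.mailingAddress = mail;

            Address biz = new Address();
            biz.line1 = flat.bizAddr1;
            biz.line2 = flat.bizAddr2;
            foo.businessAddress = biz;

            // A HibernateItemWriter can then persist Foo, with Hibernate
            // splitting the phone list off into its collection table
            return foo;
        }
    }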
