Postgres, JdbcTemplate and Spring
  • Postgres, JdbcTemplate and Spring

    Hi,

    In our company we use Spring Core (4.0.4), JdbcTemplate and Postgres 9.3.

    When I try this example:
    http://docs.spring.io/spring/docs/4....meter-handling

    I have some problems with large files (> 200 MB).
    I adapted the example for my scenario:


    Code:
        // Imports needed by this snippet; the method lives in a repository class
        // with an injected org.springframework.jdbc.core.JdbcTemplate field.
        import java.io.File;
        import java.io.FileInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;
        import org.springframework.jdbc.core.support.AbstractLobCreatingPreparedStatementCallback;
        import org.springframework.jdbc.support.lob.DefaultLobHandler;
        import org.springframework.jdbc.support.lob.LobCreator;

        public void addToDB(final long documentId, final File file) throws IOException {
            String sql = "insert into filetest (id, content) values (?, ?)";
            DefaultLobHandler defaultLobHandler = new DefaultLobHandler();
            // try-with-resources closes the stream even if the insert fails
            try (final InputStream blobIs = new FileInputStream(file)) {
                jdbcTemplate.execute(sql,
                        new AbstractLobCreatingPreparedStatementCallback(defaultLobHandler) {
                            @Override
                            protected void setValues(PreparedStatement ps, LobCreator lobCreator) throws SQLException {
                                ps.setLong(1, documentId);
                                // stream the file into the bytea column; the (int) cast is fine below 2 GB
                                lobCreator.setBlobAsBinaryStream(ps, 2, blobIs, (int) file.length());
                            }
                        });
            }
        }
    On Postgres I created the table:

    Code:
    CREATE TABLE filetest
    (
      id bigint,
      content bytea
    )
    WITH (
      OIDS=FALSE
    );
    ALTER TABLE filetest
      OWNER TO postgres;
    ALTER TABLE filetest ALTER COLUMN content SET STORAGE EXTERNAL;
    
    
    And I have a test that calls the repository with the above method.
    For small files the code works (with the pgAdmin client I can connect, and I see the two records).

    When I try to load large files (more than 200 MB) I can save the file (the implementation does not give me errors), but when I try to read it back from pgAdmin, the operation aborts. When I select only the id column I have no problem seeing it; only selecting the content column aborts the query. I think this can happen either because the record is corrupted (how can I check this?) or because the GUI client can't load the record (I tried it from the console, and there I also see strange behaviour).
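
    One way to check whether the stored row really matches the file is to let Postgres compute the size and an MD5 of the stored bytea and compare them with the local file, so the 200 MB value never has to travel back to the client. This is only a minimal sketch, assuming the same filetest table and an injected JdbcTemplate; the FiletestChecker class and method names are made up:

    Code:
        import java.io.File;
        import java.io.FileInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.security.DigestInputStream;
        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;
        import javax.xml.bind.DatatypeConverter;
        import org.springframework.jdbc.core.JdbcTemplate;

        public class FiletestChecker {

            private final JdbcTemplate jdbcTemplate;

            public FiletestChecker(JdbcTemplate jdbcTemplate) {
                this.jdbcTemplate = jdbcTemplate;
            }

            /** Returns true if the stored bytea has the same size and MD5 as the local file. */
            public boolean isStoredCorrectly(long documentId, File file)
                    throws IOException, NoSuchAlgorithmException {
                // Both values are computed server-side by Postgres, so the large
                // content column is never transferred to the client.
                Long storedLength = jdbcTemplate.queryForObject(
                        "select octet_length(content) from filetest where id = ?",
                        Long.class, documentId);
                String storedMd5 = jdbcTemplate.queryForObject(
                        "select md5(content) from filetest where id = ?",
                        String.class, documentId);

                // MD5 of the local file, computed while streaming it once.
                MessageDigest md = MessageDigest.getInstance("MD5");
                try (InputStream in = new DigestInputStream(new FileInputStream(file), md)) {
                    byte[] buffer = new byte[8192];
                    while (in.read(buffer) != -1) {
                        // reading through the DigestInputStream updates the digest
                    }
                }
                String localMd5 = DatatypeConverter.printHexBinary(md.digest()).toLowerCase();

                return storedLength != null
                        && storedLength == file.length()
                        && localMd5.equals(storedMd5);
            }
        }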

    Now the questions are:
    - Is there any Postgres parameter that can limit the size? (bytea can handle 1 GB.)
    - Is there any check I can make that the data were really written to the DB?

    Because when I try to read the record programmatically, I get a Java heap space exception...
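
    For reading the record back, Spring's reference documentation also has a streaming read example based on AbstractLobStreamingResultSetExtractor, which copies the column into an OutputStream instead of building a byte[] in your own code. A rough sketch along those lines, again assuming an injected JdbcTemplate (the FiletestReader class name and the target file are made up). One caveat: the PostgreSQL JDBC driver still buffers a bytea value on the client while it fetches the row, so a 200 MB record may still need a correspondingly larger heap:

    Code:
        import java.io.File;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import org.springframework.dao.DataAccessException;
        import org.springframework.jdbc.core.JdbcTemplate;
        import org.springframework.jdbc.core.support.AbstractLobStreamingResultSetExtractor;
        import org.springframework.jdbc.support.lob.DefaultLobHandler;
        import org.springframework.jdbc.support.lob.LobHandler;
        import org.springframework.util.FileCopyUtils;

        public class FiletestReader {

            private final JdbcTemplate jdbcTemplate;
            private final LobHandler lobHandler = new DefaultLobHandler();

            public FiletestReader(JdbcTemplate jdbcTemplate) {
                this.jdbcTemplate = jdbcTemplate;
            }

            /** Streams the content column of the given row into the target file. */
            public void readFromDB(long documentId, final File target) {
                jdbcTemplate.query(
                        "select content from filetest where id = ?",
                        new Object[] { documentId },
                        new AbstractLobStreamingResultSetExtractor() {
                            @Override
                            public void streamData(ResultSet rs)
                                    throws SQLException, IOException, DataAccessException {
                                // FileCopyUtils.copy closes both streams when it is done.
                                FileCopyUtils.copy(
                                        lobHandler.getBlobAsBinaryStream(rs, 1),
                                        new FileOutputStream(target));
                            }
                        });
            }
        }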


  • #2
    Please move your question to Stack Overflow - we are going to close this forum soon and rely on SO.
    We need to clean up this forum (old unanswered questions) before closing it.

    Thanks for your understanding.
