  • Incorrect index creation with custom mapping converter and multiple Mongo dbs

    Hi,
    I am having trouble with index creation using multiple Mongo databases in Spring Data MongoDB 1.0.3.RELEASE. The attached Maven project demonstrates the problem.

    There is a bean called Foo and a repository FooRepository. Foo has an @Indexed property prop.

    I want my Foo instances to be stored in a database called mongo-index-db1. There are other repositories, stored in another database mongo-index-db2, which require a custom mapping converter. I have created separate mongo templates for the two databases and passed them into the <mongo:repositories /> and <mongo:mapping-converter /> tags respectively. Unfortunately, my Foo indexes are created in mongo-index-db2:

    Code:
    Mon Jul 30 16:26:04 [initandlisten] connection accepted from 127.0.0.1:60860 #402
    Mon Jul 30 16:26:04 [conn402] build index mongo-index-db2.foo { _id: 1 }
    Mon Jul 30 16:26:04 [conn402] build index done 0 records 0.002 secs
    Mon Jul 30 16:26:04 [conn402] info: creating collection mongo-index-db2.foo on add index
    Mon Jul 30 16:26:04 [conn402] build index mongo-index-db2.foo { prop: 1 }
    Mon Jul 30 16:26:04 [conn402] build index done 0 records 0.002 secs
    Mon Jul 30 16:26:05 [conn402] end connection 127.0.0.1:60860
    This appears to be due to the indexCreationHelper bean created in the MappingMongoConverterParser class receiving the MappingContextEvents for each mongo template. If I comment out the following lines:
    Code:
    		try {
    			registry.getBeanDefinition(INDEX_HELPER);
    		} catch (NoSuchBeanDefinitionException ignored) {
    			if (!StringUtils.hasText(dbFactoryRef)) {
    				dbFactoryRef = DB_FACTORY;
    			}
    			BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
    					.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
    			indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
    			indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(dbFactoryRef));
    			registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
    		}
    then indexes get created everywhere:
    Code:
    Mon Jul 30 16:36:10 [initandlisten] connection accepted from 127.0.0.1:61141 #406
    Mon Jul 30 16:36:10 [conn406] build index mongo-index-db1.foo { _id: 1 }
    Mon Jul 30 16:36:10 [conn406] build index done 0 records 0.001 secs
    Mon Jul 30 16:36:10 [conn406] info: creating collection mongo-index-db1.foo on add index
    Mon Jul 30 16:36:10 [conn406] build index mongo-index-db1.foo { prop: 1 }
    Mon Jul 30 16:36:10 [conn406] build index done 0 records 0.001 secs
    Mon Jul 30 16:36:11 [initandlisten] connection accepted from 127.0.0.1:61142 #407
    Mon Jul 30 16:36:11 [conn407] build index mongo-index-db2.foo { _id: 1 }
    Mon Jul 30 16:36:11 [conn407] build index done 0 records 0.001 secs
    Mon Jul 30 16:36:11 [conn407] info: creating collection mongo-index-db2.foo on add index
    Mon Jul 30 16:36:11 [conn407] build index mongo-index-db2.foo { prop: 1 }
    Mon Jul 30 16:36:11 [conn407] build index done 0 records 0.001 secs
    Mon Jul 30 16:36:11 [conn407] end connection 127.0.0.1:61142
    Mon Jul 30 16:36:11 [conn406] end connection 127.0.0.1:61141
    which is... not great, but better. Everything seems to work with the indexCreationHelper code removed, but I'm not really sure what it's doing, so something could be going subtly wrong somewhere.

    Is there another way I should be going about this to avoid this problem?

    Thanks.

  • #2
    There were two aspects to the problem: one was a bug (or a missing feature, more or less), the other a slight misconfiguration in your code.

    I've created DATAMONGO-500 [0] and DATACMNS-209 [1] to allow dismissing events emitted by MappingContexts not connected to the MongoPersistentEntityIndexCreator. So, using the snapshots, you have to adapt your config to something like this:

    Code:
    <?xml version="1.0" encoding="UTF-8"?>
    <beans …>
        
        <mongo:mapping-converter id="mongoConverter" db-factory-ref="factory2" base-package="$your.base.package" />
        
        <bean id="mongo1" class="org.springframework.data.mongodb.core.MongoTemplate">
        	<constructor-arg name="mongoDbFactory" ref="factory1" />
        </bean>
    
        <bean id="mongo2" class="org.springframework.data.mongodb.core.MongoTemplate">
        	<constructor-arg ref="factory2" />
            <constructor-arg ref="mongoConverter" />
        </bean>
    
        <mongo:db-factory id="factory1" host="127.0.0.1" dbname="mongo-index-db1" />	
        <mongo:db-factory id="factory2" host="127.0.0.1" dbname="mongo-index-db2" />
    
    </beans>
    The key point here is that you have to hand the mongoConverter instance into the MongoTemplate, as otherwise it will create a new default converter internally. Feel free to try that setup with a recent snapshot build.
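    Purely for illustration, here is a rough programmatic sketch of that wiring (a minimal sketch only: it leaves out the custom converters and the repository namespace, and the class and variable names are made up, so treat it as an approximation rather than the reference setup):
    Code:
    import com.mongodb.Mongo;

    import org.springframework.data.mongodb.MongoDbFactory;
    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
    import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
    import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

    public class TwoDatabaseSetup {

    	public static void main(String[] args) throws Exception {

    		Mongo mongo = new Mongo("127.0.0.1");
    		MongoDbFactory factory1 = new SimpleMongoDbFactory(mongo, "mongo-index-db1");
    		MongoDbFactory factory2 = new SimpleMongoDbFactory(mongo, "mongo-index-db2");

    		// The mapping converter is bound to the factory of the second database.
    		MappingMongoConverter mongoConverter =
    				new MappingMongoConverter(factory2, new MongoMappingContext());
    		mongoConverter.afterPropertiesSet();

    		// Template for db1: no converter passed in, so it creates a default one internally.
    		MongoTemplate mongo1 = new MongoTemplate(factory1);

    		// Template for db2: the custom converter is handed in explicitly.
    		MongoTemplate mongo2 = new MongoTemplate(factory2, mongoConverter);
    	}
    }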

    [0] https://jira.springsource.org/browse/DATAMONGO-500
    [1] https://jira.springsource.org/browse/DATACMNS-209



    • #3
      Hi Oliver

      I've finally had a chance to test this with 1.0.5.BUILD-SNAPSHOT and the problem remains.

      I also tried it with 1.1.0.RC1, and the prop index wasn't created at all:
      Code:
      mongo-index-db1.foo indexes: [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "mongo-index-db1.foo" , "name" : "_id_"}]
      mongo-index-db2.foo indexes: []
      Also, my "real" implementation is written in Groovy, which leads to the following StackOverflowError when saving with 1.1:
      Code:
      Caught: java.lang.StackOverflowError
      java.lang.StackOverflowError
      	at org.springframework.data.mongodb.core.convert.CustomConversions.getCustomTarget(CustomConversions.java:288)
      	at org.springframework.data.mongodb.core.convert.CustomConversions.getCustomWriteTarget(CustomConversions.java:235)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:426)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:373)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:362)
      	at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:195)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:362)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:439)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:373)
      	at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:362)
      	at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:195)
      etc...
      This error doesn't happen with 1.0.5.BUILD-SNAPSHOT.

      My config now looks like this:
      Code:
      <beans xmlns="http://www.springframework.org/schema/beans"
      	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      	xmlns:mongo="http://www.springframework.org/schema/data/mongo"
      	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
      		http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">
      
          <mongo:repositories base-package="index.repositories" mongo-template-ref="mongo1" />
          
          <mongo:mapping-converter id="mongoConverter" db-factory-ref="factory2" base-package="index.beans">
      	<!-- custom mapping converters here -->
          </mongo:mapping-converter>
          
          <bean id="mongo1" class="org.springframework.data.mongodb.core.MongoTemplate">
          	<constructor-arg name="mongoDbFactory" ref="factory1" />
          </bean>
      
          <bean id="mongo2" class="org.springframework.data.mongodb.core.MongoTemplate">
          	<constructor-arg name="mongoDbFactory" ref="factory2" />
          	<constructor-arg name="mongoConverter" ref="mongoConverter" />
          </bean>
      
      	<mongo:db-factory id="factory1" host="127.0.0.1" dbname="mongo-index-db1" />
      	
      	<mongo:db-factory id="factory2" host="127.0.0.1" dbname="mongo-index-db2" />
      </beans>



      • #4
        Thanks for your patience, Andy. After a round of debugging I discovered a missing super.… call, which caused the initial entity instantiation not to be triggered at all. I've created and fixed a ticket [0] for this, and your sample project is now working as expected for me.

        Is there a chance you can add more context to the StackOverflowError you see? I haven't really seen anyone using SD MongoDB with Groovy, so I'd like to get more insight into this issue.

        [0] https://jira.springsource.org/browse/DATAMONGO-530



        • #5
          Thanks for looking into this. I've attached an example which works fine with 1.0 and blows up with 1.1. Looking at some of the places where the stack overflow happens, it looks like it might be hash-code related.



          • #6
            The issue seems to be related to a few fields introduced by the Groovy compiler that keep a reference to the actual object. This in turn creates a bidirectional dependency, which causes the stack overflow. We'll essentially have to ignore those fields. I'll have a chat with Guillaume for the details. Would you mind creating a JIRA issue in the meantime?
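            To illustrate the kind of cycle involved, here is a minimal Java sketch (not the actual Groovy-generated code) of a field that points back at its owner:
            Code:
            // When the converter walks the properties of an instance it finds a
            // property whose value is the instance itself, recurses into it again
            // and eventually overflows the stack.
            public class Node {

            	Node self = this;
            }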

            PS: is the plain Java project now working for you?



            • #7
              I've opened an issue: DATAMONGO-531.

              I've just tried the latest git version with the Java implementation and the index is still not created:

              Code:
              import org.springframework.data.annotation.Id;
              import org.springframework.data.mongodb.core.index.Indexed;
              import org.springframework.data.mongodb.core.mapping.Document;

              @Document
              public class Foo {

              	public Foo(String id, String prop) {
              		this.id = id;
              		this.prop = prop;
              	}

              	@Id
              	String id;

              	@Indexed
              	String prop;
              }
              Code:
              mongo-index-db1.foo indexes: [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "mongo-index-db1.foo" , "name" : "_id_"}]
              mongo-index-db2.foo indexes: []
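              For reference, index listings in that shape can be produced with a small helper along these lines (a minimal sketch; the helper itself and the hard-coded foo collection name are assumptions about the sample project):
              Code:
              import java.util.List;

              import com.mongodb.DBObject;

              import org.springframework.data.mongodb.core.MongoTemplate;

              public class IndexDump {

              	// Prints the raw index metadata of the foo collection in the
              	// database behind the given template.
              	public static void dump(String database, MongoTemplate template) {
              		List<DBObject> indexes = template.getCollection("foo").getIndexInfo();
              		System.out.println(database + ".foo indexes: " + indexes);
              	}
              }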



              • #8
                Note that I had to change the sample code to add a base-package attribute to the converter declaration. If that's missing, the entity will not get added on application context startup and the index will not be created. It would be eventually, but only if you invoke a persistence operation on an instance of it (which your Runner doesn't do). So make sure you have that attribute added, or alter the Runner accordingly. I could reproduce the correct behavior with your sample project once the setup was fixed.
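                If you'd rather alter the Runner, a minimal sketch of what it would have to do (the context file name and the constructor arguments are assumptions about the sample project):
                Code:
                import index.beans.Foo;
                import index.repositories.FooRepository;

                import org.springframework.context.ApplicationContext;
                import org.springframework.context.support.ClassPathXmlApplicationContext;

                public class Runner {

                	public static void main(String[] args) {

                		ApplicationContext context = new ClassPathXmlApplicationContext("applicationContext.xml");

                		// Invoking a persistence operation registers Foo with the mapping
                		// context even without the base-package attribute on the converter.
                		FooRepository repository = context.getBean(FooRepository.class);
                		repository.save(new Foo("some-id", "some value"));
                	}
                }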



                • #9
                  I have too many copies of this code

                  You're right, I was missing the base-package attribute on the mapping-converter. I've attached the updated code.

                  However, the problem still remains. The index is created in the mongo db attached to the converter, not the mongo db attached to the repository:

                  Code:
                  mongo-index-db1.foo count: 1
                  mongo-index-db2.foo count: 0
                  mongo-index-db1.foo indexes: [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "mongo-index-db1.foo" , "name" : "_id_"}]
                  mongo-index-db2.foo indexes: [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "mongo-index-db2.foo" , "name" : "_id_"}, { "v" : 1 , "key" : { "prop" : 1} , "ns" : "mongo-index-db2.foo" , "name" : "prop" , "dropDups" : false , "sparse" : false}]



                  • #10
                    We were halfway there, actually. Here's what's correct already.

                    1. The setup of the second database picks up the entity on application bootstrap, triggers our indexing infrastructure and creates the index for the prop property. This transitively kicks off Mongo's index creation for _id. This essentially produces the output you see for db2.
                    2. You have wired the repositories to the first database (mongo-index-db1)! This means that the Foo class gets added to the first template's mapping context, gets persisted and thus the _id index is created transparently. Still, according to your configuration, the index information should actually be equivalent to what you see for db2.

                    So we have two things here: first, you don't see the output you expect, as you probably want to wire the second template to the repositories and leave the first database untouched. Second, even with the setup configured erroneously as you have it, the index setup should actually look identical for both databases. There was a minor glitch in the application context reconfiguration inside MongoTemplate which caused the index creation for database 1 not to be triggered correctly.

                    Long story short: I've created another ticket [0] and fixed it and deployed it. So here is what you should see right now:

                    1. If you leave the configuration as is, you should see indexes created in both databases, as the lookup of the FooRepository triggers Foo being added to the mapping context of database one. It's actually the repository lookup that adds Foo, not the call to save.
                    2. If you remove the repository lookup, no indexes should be created in database one.
                    3. If you fix the setup to wire template 2 into the repositories, you should only see indexes created in database 2, independent of whether you look up the repo or not (which is probably what you originally would have liked to see). A rough sketch of that wiring follows below.
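                    In the XML above, that third option is just mongo-template-ref="mongo2" on <mongo:repositories />. A rough programmatic equivalent, purely to make the wiring explicit (this is a sketch, not part of the sample project):
                    Code:
                    import index.repositories.FooRepository;

                    import org.springframework.data.mongodb.core.MongoOperations;
                    import org.springframework.data.mongodb.repository.support.MongoRepositoryFactory;

                    public class RepositoryWiring {

                    	// Building the repository against the second template means Foo only
                    	// gets registered with, and indexed in, mongo-index-db2.
                    	public static FooRepository fooRepository(MongoOperations mongo2) {
                    		return new MongoRepositoryFactory(mongo2).getRepository(FooRepository.class);
                    	}
                    }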

                    I'll go on with the Groovy glitch on Monday…

                    [0] https://jira.springsource.org/browse/DATAMONGO-533



                    • #11
                      Ok, I've tried it with that fix applied and I do see collections and indexes created in both dbs, which is fine for all practical purposes.

                      Perhaps I should give a little background on what I'm doing. My "real" application uses a CQRS architecture and so I'm saving command-side collections in one db and query projections in another. Only objects on the command side use custom mappers.

                      In my example app, this corresponds to the query repositories being registered to mongo1 and the mapping converter being registered to mongo2: no Foos will ever be persisted in mongo2. So my expectation is to see no foo collection in mongo2 at all. A more representative example would have Bar beans being mapped and persisted in mongo2.

                      I hope this makes what I'm doing clearer.



                      • #12
                        The Groovy issue [0] is fixed in Spring Data Commons. So with the latest versions in the classpath I get your Groovy code to work.

                        [0] https://jira.springsource.org/browse/DATACMNS-228



                        • #13
                          Code:
                          Saving a Foo
                          Saved
                          Fantastic! Thanks, Oliver.



                          • #14
                            Thanks for your patience!
