
  • MongoDB: No exception when saving dup to @Indexed(unique=true)?

    Hi,

    I have an object with a field annotated with @Indexed(unique = true).

    Is it by design that when I try to save two objects with the same value for that field, the second save is silently ignored without any exception, while saving via the mongodb shell reports an "E11000 duplicate key error index"?

    Thanks for any input.
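
    For what it's worth, this kind of silent failure usually comes down to the driver's default write concern: with the old default (WriteConcern.NORMAL) the client does not wait for the server's getLastError response, so the E11000 error never reaches the application. A minimal sketch of making such errors surface as exceptions, assuming a Spring Data MongoDB release whose MongoTemplate exposes setWriteConcern(...) and setWriteResultChecking(...):
    Code:
        import com.mongodb.Mongo;
        import com.mongodb.WriteConcern;
        import org.springframework.data.mongodb.core.MongoTemplate;
        import org.springframework.data.mongodb.core.WriteResultChecking;

        // Sketch only: class name is made up; imports match Spring Data MongoDB 1.x,
        // while the milestone builds used the org.springframework.data.document.mongodb package.
        public class StrictTemplateSetup {

            public MongoTemplate strictTemplate() throws Exception {
                MongoTemplate template = new MongoTemplate(new Mongo("localhost"), "feed-db");
                // Wait for the server's acknowledgement so a duplicate-key violation is reported back ...
                template.setWriteConcern(WriteConcern.SAFE);
                // ... and let the template turn reported write errors into exceptions instead of ignoring them.
                template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
                return template;
            }
        }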

  • #2
    I also see the same issue. For me the object gets saved, and then I cannot do any more operations due to the "E11000" error. Does anyone else see the same issue? Or could you fix this?

    Hari Gangadharan



    • #3
      What version are you using? We had a bug filed against M1 but that was resolved for M2: https://jira.springsource.org/browse/DATADOC-70



      • #4
        I am using M2. I did more tests: the save appears to succeed - it even generates a new ObjectId, and no exception is thrown - but the record is really not saved.

        Any help is greatly appreciated.

        Hari Gangadharan



        • #5
          My Test Output

          Here is my test:
          Code:
              
              @Test
              public void addDups() {
                  log.info("Test Add Dups.....");
                  PublicFeed hariFeed = repository.findByLinkId("some-id-for");
                  assertNotNull(hariFeed);
                  log.info("Hari Feed " + hariFeed);
                  ObjectId origId = harisFeed.getId();
                  harisFeed.setId(null);
                  repository.save(harisFeed);
                  log.info("Id: " + harisFeed.getId() + " Orig Id: " + origId);
                  log.info("Test Add Dups - Done");
              }
          And the output:
          Code:
          22:20:09,331  INFO .feed.api.test.PublicFeedIntegrationTest:  65 - Test Add Dups.....
          22:20:09,332 DEBUG ent.mongodb.repository.MongoQueryCreator: 114 - Created query { "linkId" : "some-id-for"}
          22:20:09,332 DEBUG work.data.document.mongodb.MongoTemplate:1004 - find using query: { "linkId" : "some-id-for"} fields: null for class: class spot.feed.api.model.PublicFeed
          22:20:09,333  INFO .feed.api.test.PublicFeedIntegrationTest:  68 - Hari Feed PublicFeed{id=4dcb6e092d3eef627f9b19a1, title=Hari's Feed, description=Follow Hari on his world trip, usage=null, linkId=some-id-for, daysRange=14, devices=[], features=[]}
          22:20:09,334 DEBUG work.data.document.mongodb.MongoTemplate: 792 - save DBObject containing fields: [linkId, title, description, features, devices, daysRange]
          22:20:09,335  INFO .feed.api.test.PublicFeedIntegrationTest:  72 - Id: 4dcb6e092d3eef62809b19a1 Orig Id: 4dcb6e092d3eef627f9b19a1
          22:20:09,335  INFO .feed.api.test.PublicFeedIntegrationTest:  73 - Test Add Dups - Done
          What I see in the database after this insert:
          Code:
          > db.publicfeed.find()      
          { "_id" : ObjectId("4dcb6e092d3eef627e9b19a1"), "linkId" : "some-id-other", "title" : "Other Feed", "description" : "Follow Other world trip", "features" : [ ], "devices" : [ ], "daysRange" : 14 }
          { "_id" : ObjectId("4dcb6e092d3eef627f9b19a1"), "linkId" : "some-id-for", "title" : "Hari's Feed", "description" : "Follow Hari on his world trip", "features" : [ ], "devices" : [ ], "daysRange" : 14 }
          Here is the index (I have tried all the combinations - dropDups = false and true, sparse = false and true, etc. - all had the same result):
          Code:
          > db.publicfeed.getIndexes()
          [
          	{
          		"name" : "_id_",
          		"ns" : "feed-db.publicfeed",
          		"key" : {
          			"_id" : 1
          		}
          	},
          	{
          		"name" : "linkId_1",
          		"ns" : "feed-db.publicfeed",
          		"dropDups" : false,
          		"sparse" : false,
          		"unique" : true,
          		"key" : {
          			"linkId" : 1
          		}
          	},
          	{
          		"name" : "findByLinkId",
          		"ns" : "feed-db.publicfeed",
          		"key" : {
          			"linkId" : -1
          		}
          	}
          ]
          I am also not very happy that the repository scan created another index. Shouldn't it check for an already existing index?
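
          (For illustration only, assuming a DBCollection named collection in scope: a check along these lines against getIndexInfo() - a sketch, not what the framework actually does - would detect the existing linkId index before creating another one.)
          Code:
          // Sketch only: skip index creation if some index already covers the linkId key.
          boolean alreadyIndexed = false;
          for (DBObject idx : collection.getIndexInfo()) {
              DBObject key = (DBObject) idx.get("key");
              if (key != null && key.containsField("linkId")) {
                  alreadyIndexed = true;
                  break;
              }
          }
          if (!alreadyIndexed) {
              collection.ensureIndex(new BasicDBObject("linkId", 1), "findByLinkId");
          }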



          • #6
            I am a bit puzzled by your example: the test case shows hariFeed as well as harisFeed. Is this a typo or intended? Comparing your indexes to your database content seems to make sense as well: you have unique=true on linkId, which is fine, as one document has linkId set to "some-id-for" while the other has it set to "some-id-other". Unfortunately I cannot see the linkId being manipulated in your test case (that's the puzzling part). The ObjectIds of the documents in the database differ as well.

            Beyond that, make sure you've understood the dropDups option correctly. This option drops duplicate documents *when the index is built*. It won't affect any further storage operations (at least from what I've read here [1]).

            So maybe we should start by rewinding to what you would expect to see and come up with an even more stripped-down example?

            Regarding the additional index: we should probably check for an existing index covering the same keys before creating a new one (so far I actually assumed the call to ensureIndex(…) would cover that). Could you please open a JIRA for that?

            Cheers, and thanks for the effort you put into this.

            [1] http://www.mongodb.org/display/DOCS/...-UniqueIndexes
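
            As a starting point for such a stripped-down example, here is a bare-driver sketch (bypassing the mapping layer entirely; host, database and collection names are placeholders borrowed from the thread) that builds the unique index up front and inserts the same linkId twice with WriteConcern.SAFE, so the second insert should fail with the E11000 duplicate key error:
            Code:
            import com.mongodb.BasicDBObject;
            import com.mongodb.DBCollection;
            import com.mongodb.Mongo;
            import com.mongodb.WriteConcern;

            // Sketch only: plain MongoDB Java driver, no Spring Data involved.
            public class UniqueIndexRepro {

                public static void main(String[] args) throws Exception {
                    DBCollection col = new Mongo("localhost").getDB("feed-db").getCollection("uniqueRepro");
                    // Check the server's response for every write on this collection.
                    col.setWriteConcern(WriteConcern.SAFE);
                    // Build the unique index before any data exists.
                    col.ensureIndex(new BasicDBObject("linkId", 1), new BasicDBObject("unique", true));
                    col.insert(new BasicDBObject("linkId", "some-id-for"));
                    // The duplicate below should make the driver throw on the E11000 error.
                    col.insert(new BasicDBObject("linkId", "some-id-for"));
                }
            }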



            • #7
              Sorry, I did not explain my test properly. In my test setup I save two objects, one with linkId "some-id-for" and another with linkId "some-id-other". In the test itself I retrieve the record with "some-id-for" and save it again (I even set the id to null because I suspected it was doing an update when the original id was present). Then I log both ids - see the id of the record just inserted and the original id in the log below. When I do a find on the collection in the mongodb shell, the object with the new id is missing; the one with the original id is still there.

              As for sparse and dropDups... I read the docs; I was just trying (without much hope) to see whether changing them would give a different outcome.

              Here is my full test (updated):
              Code:
                  @Before
                  @Override
                  public void setUp() {
                      List<PublicFeed> allFeeds = repository.findAll();
                      for (PublicFeed feed: allFeeds) {
                          repository.delete(feed);
                      }
                      PublicFeed harisFeed = new PublicFeed("Hari's Feed", "Follow Hari on his world trip", "some-id-for", 14);
                      PublicFeed someFeed = new PublicFeed("Other Feed", "Follow Other world trip", "some-id-other", 14);
                      repository.save(someFeed);
                      repository.save(harisFeed);
                  }
                  
                  @Test
                  public void addDups() {
                      log.info("Test Add Dups.....");
                      PublicFeed harisFeed = repository.findByLinkId("some-id-for");
                      log.info("Haris Feed " + harisFeed);
                      harisFeed.setId(null);
                      repository.save(harisFeed);
                      log.info("Haris Feed " + harisFeed);
                      log.info("Test Add Dups - Done");
                  }
              And this is the log:

              Code:
              01:05:33,699 DEBUG work.data.document.mongodb.MongoTemplate: 792 - save DBObject containing fields: [linkId, title, description, features, devices, daysRange]
              01:05:33,701 DEBUG work.data.document.mongodb.MongoTemplate: 792 - save DBObject containing fields: [linkId, title, description, features, devices, daysRange]
              01:05:33,702  INFO .feed.api.test.PublicFeedIntegrationTest:  61 - Test Add Dups.....
              01:05:33,703 DEBUG ent.mongodb.repository.MongoQueryCreator: 114 - Created query { "linkId" : "some-id-for"}
              01:05:33,704 DEBUG work.data.document.mongodb.MongoTemplate:1004 - find using query: { "linkId" : "some-id-for"} fields: null for class: class spot.feed.api.model.PublicFeed
              01:05:33,706  INFO .feed.api.test.PublicFeedIntegrationTest:  63 - Haris Feed PublicFeed{id=4dcb94cde582ef62083c7512, title=Hari's Feed, description=Follow Hari on his world trip, usage=null, linkId=some-id-for, daysRange=14, devices=[], features=[]}
              01:05:33,708 DEBUG work.data.document.mongodb.MongoTemplate: 792 - save DBObject containing fields: [linkId, title, description, features, devices, daysRange]
              01:05:33,710  INFO .feed.api.test.PublicFeedIntegrationTest:  66 - Haris Feed PublicFeed{id=4dcb94cde582ef62093c7512, title=Hari's Feed, description=Follow Hari on his world trip, usage=null, linkId=some-id-for, daysRange=14, devices=[], features=[]}
              01:05:33,711  INFO .feed.api.test.PublicFeedIntegrationTest:  67 - Test Add Dups - Done
              I will open a JIRA issue soon. Can we get this fixed in the next milestone?



              • #8
                I reopened the issue you referred to, since the fix was not complete: https://jira.springsource.org/browse/DATADOC-70

                Hari Gangadharan



                • #9
                  It seems the additional index created by the repository layer is overriding the former one, and thus the unique constraint is not enforced. Could you please try manually dropping the additional index using:

                  Code:
                  mongoTemplate.execute(new DBCallback<Void>() {
                    public Void execute(DBCollection collection) {
                      collection.dropIndex("findByLinkId");
                      return null;
                    }
                  });
                  It seems the additional index is doing more harm than expected, so I'll get my hands on it as soon as we have a ticket for this. Please don't re-open tickets marked as closed for an already released version. Feel free to simply open another one, which we can then bind to the next release version.



                  • #10
                    Good thinking, but I don't think that is the reason. The code you provided seems to be obsolete, so I added this:
                    Code:
                            mongoTemplate.execute("publicfeed",new CollectionCallback<PublicFeed>() {
                                @Override
                                public PublicFeed doInCollection(DBCollection dbc) throws MongoException, DataAccessException {
                                    log.info("Listing Indexes....");
                                    List<DBObject> indexes = dbc.getIndexInfo();
                                    for (DBObject index: indexes) {
                                        log.info("Index: " + index);
                                    }
                                    log.info("Dropping Index....");
                                    dbc.dropIndex("findByLinkId");
                                    log.info("Listing Indexes....");
                                    indexes = dbc.getIndexInfo();
                                    for (DBObject index: indexes) {
                                        log.info("Index: " + index);
                                    }
                                    return null;
                                }
                            });
                    However, I was disappointed to find the exact same result...

                    Code:
                    13:50:29,268  INFO .feed.api.test.PublicFeedIntegrationTest:  69 - Test Add Dups.....
                    13:50:29,268 DEBUG ent.mongodb.repository.MongoQueryCreator: 114 - Created query { "linkId" : "some-id-for"}
                    13:50:29,268 DEBUG work.data.document.mongodb.MongoTemplate:1004 - find using query: { "linkId" : "some-id-for"} fields: null for class: class spot.feed.api.model.PublicFeed
                    13:50:29,270  INFO .feed.api.test.PublicFeedIntegrationTest:  74 - Listing Indexes....
                    13:50:29,271  INFO .feed.api.test.PublicFeedIntegrationTest:  77 - Index: { "name" : "_id_" , "ns" : "feed-db.publicfeed" , "key" : { "_id" : 1}}
                    13:50:29,271  INFO .feed.api.test.PublicFeedIntegrationTest:  77 - Index: { "name" : "linkId_1" , "ns" : "feed-db.publicfeed" , "dropDups" : false , "sparse" : false , "unique" : true , "key" : { "linkId" : 1}}
                    13:50:29,272  INFO .feed.api.test.PublicFeedIntegrationTest:  77 - Index: { "name" : "findByLinkId" , "ns" : "feed-db.publicfeed" , "key" : { "linkId" : -1}}
                    13:50:29,272  INFO .feed.api.test.PublicFeedIntegrationTest:  79 - Dropping Index....
                    13:50:29,274  INFO .feed.api.test.PublicFeedIntegrationTest:  81 - Listing Indexes....
                    13:50:29,275  INFO .feed.api.test.PublicFeedIntegrationTest:  84 - Index: { "name" : "_id_" , "ns" : "feed-db.publicfeed" , "key" : { "_id" : 1}}
                    13:50:29,275  INFO .feed.api.test.PublicFeedIntegrationTest:  84 - Index: { "name" : "linkId_1" , "ns" : "feed-db.publicfeed" , "dropDups" : false , "sparse" : false , "unique" : true , "key" : { "linkId" : 1}}
                    13:50:29,275  INFO .feed.api.test.PublicFeedIntegrationTest:  89 - Haris Feed PublicFeed{id=4dcc481561b7debcfdb4922d, title=Hari's Feed, description=Follow Hari on his world trip, usage=null, linkId=some-id-for, daysRange=14, devices=[], features=[]}
                    13:50:29,276 DEBUG work.data.document.mongodb.MongoTemplate: 792 - save DBObject containing fields: [linkId, title, description, features, devices, daysRange]
                    13:50:29,276  INFO .feed.api.test.PublicFeedIntegrationTest:  92 - Haris Feed PublicFeed{id=4dcc481561b7debcfeb4922d, title=Hari's Feed, description=Follow Hari on his world trip, usage=null, linkId=some-id-for, daysRange=14, devices=[], features=[]}
                    13:50:29,277  INFO .feed.api.test.PublicFeedIntegrationTest:  93 - Test Add Dups - Done



                    • #11
                      Opened a new JIRA issue: https://jira.springsource.org/browse/DATADOC-134



                      • #12
                        I have got the same problem. I tried to modify a test from the Spring Data book and added one more test to check that it throws a DuplicateKeyException.

                        So this is my test:
                        Code:
                          @Test(expected = DuplicateKeyException.class)
                            public void preventsDuplicateEmail() {
                        
                                Customer dave = repository.findByEmailAddress(new EmailAddress("[email protected]"));
                        
                                Customer anotherDave = new Customer("Dave", "Matthews");
                                anotherDave.setEmailAddress(dave.getEmailAddress());
                        
                                repository.save(anotherDave);
                            }
                        
                            @Test(expected = DuplicateKeyException.class)
                            public void preventsDuplicateEmail2() {
                                Customer customer = new Customer("Oliver", "Gierke");
                                customer.setEmailAddress(new EmailAddress("[email protected]"));
                        
                                Customer customer2 = new Customer("Oliver2", "Gierke2");
                                customer.setEmailAddress(new EmailAddress("[email protected]"));
                        
                                operations.insert(customer);
                                operations.insert(customer2);
                        
                            }
                        As you can see, the second test tries to insert, within a single test, two different customers that have the same email address.
                        The first test passes, but the second one, preventsDuplicateEmail2, does NOT.

                        When I check the result in the mongo shell, this is what I see:

                        Code:
                        {
                            "_id" : ObjectId("52cd03bfe4b0fec628febdf5"),
                            "_class" : "com.oreilly.springdata.mongodb.core.Customer",
                            "firstname" : "Oliver",
                            "lastname" : "Gierke",
                            "email" : "[email protected]",
                            "addresses" : [ ]
                        }
                        {
                            "_id" : ObjectId("52cd03bfe4b0fec628febdf6"),
                            "_class" : "com.oreilly.springdata.mongodb.core.Customer",
                            "firstname" : "Oliver2",
                            "lastname" : "Gierke2",
                            "addresses" : [ ]
                        }
                        The first document has the email, but the second document does not :/

                        This is my code:
                        http://www.speedyshare.com/Aun5h/mongodb.zip
                        Last edited by lukasw44; Jan 8th, 2014, 03:23 AM.
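
                        Judging from the snippet above, the second setEmailAddress(...) call is made on customer rather than customer2, which matches the shell output (the second document has no email field at all) and would also explain why the unique index on the email never kicks in. A corrected version of that part of the test (same identifiers as in the post, not verified against the attached project) would be:
                        Code:
                            Customer customer2 = new Customer("Oliver2", "Gierke2");
                            // was: customer.setEmailAddress(...) - the duplicate address belongs on customer2
                            customer2.setEmailAddress(new EmailAddress("[email protected]"));

                            operations.insert(customer);
                            operations.insert(customer2);   // now a genuine duplicate on the unique email index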

