Hibernate: Mistakes to avoid

Hibernate is the most popular and widely used object-relational mapping (ORM) framework for Java. While using Hibernate, we may run into unexpected behaviour due to a few details that are easy to miss.

I have listed some of the mistakes I made during development that can easily be avoided:

1. Not specifying the fetch type for the to-one side of relationships.

The default fetch type for the to-one side of a relationship is eager. FetchType.EAGER causes Hibernate to load all associated entities as soon as you load the parent entity. This results in executing many unwanted queries and becomes a serious performance issue when a large number of records has to be fetched.

So always remember to explicitly set FetchType.LAZY if you do not want associated entities to be loaded along with the parent entity.
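A minimal sketch of an explicit lazy mapping (using the Person/Address entities that appear later in this post):

```java
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

@Entity
public class Address {

    @Id
    private Long id;

    // Without fetch = FetchType.LAZY, @ManyToOne defaults to EAGER and the
    // Person row is loaded every time an Address is loaded.
    @ManyToOne(fetch = FetchType.LAZY)
    private Person person;
}
```

Note that collection mappings such as @OneToMany already default to lazy; it is the to-one annotations (@ManyToOne, @OneToOne) that default to eager.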

2. Not specifying the allocation size for a database sequence.

Hibernate provides a facility to use a custom SQL sequence to generate ids for your entities:

@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "sequence_generator_for_person")
@SequenceGenerator(name = "sequence_generator_for_person", sequenceName = "person_sequence")
@Column(name = "person_id")
private Long id;

In the above case, you must create person_sequence via SQL. The important factor to note here is the allocation size of the sequence generator. The default allocationSize is 50, and it may result in generating unexpected negative values.

If the sequence's next value is defined as 1 in the beginning, the generator reads 1 as the top of its allocation block and can hand out ids from roughly -48 up to 1, mostly producing negative values. To avoid this problem, the increment of the database sequence and the allocationSize of the generator must be set to the same value.
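The arithmetic behind this can be sketched in plain Java. This is a simplified model of a pooled-style optimizer, not Hibernate's actual implementation:

```java
// Simplified model of a pooled-style id optimizer: the value fetched from the
// database sequence marks the HIGH end of a block of allocationSize ids.
public class SequenceAllocationDemo {

    static long firstIdInBlock(long sequenceValue, int allocationSize) {
        return sequenceValue - allocationSize + 1;
    }

    public static void main(String[] args) {
        // Sequence starts at 1 but allocationSize is left at the default 50:
        System.out.println(firstIdInBlock(1, 50)); // -48: generated ids go negative
        // Sequence increment and allocationSize both set to 1:
        System.out.println(firstIdInBlock(1, 1));  // 1: ids start where expected
    }
}
```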

3. Not using @JsonManagedReference and @JsonBackReference for bidirectional relationships.

Suppose we have two associated entities as follows:

public class Person {
    public int id;
    public String name;
    public List<Address> addressList;
} 

public class Address {
    public int id;
    public String fullAddress;
    public Person person;
}

When we try to serialize an instance of Person or Address, Jackson throws a JsonMappingException. This is caused by infinite recursion: Address holds an instance of Person, which in turn holds a List of Address. To avoid this condition, we need to annotate our entities as follows:

public class Person {
    public int id;
    public String name;
    @JsonManagedReference
    public List<Address> addressList;
}

public class Address {
    public int id;
    public String fullAddress;
    @JsonBackReference
    public Person person;
}

This allows Jackson to manage the relationship. Here @JsonManagedReference marks the forward part of the reference, which gets serialized normally.

@JsonBackReference marks the back part of the reference, which is omitted during serialization. Alternatively, we can use @JsonIgnore to tell Jackson which property to ignore during serialization.
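If re-populating the back reference during deserialization is not needed, a sketch of the @JsonIgnore alternative (assuming Jackson's jackson-databind is on the classpath):

```java
import com.fasterxml.jackson.annotation.JsonIgnore;

public class Address {
    public int id;
    public String fullAddress;

    // Break the cycle by skipping the back reference entirely. Unlike
    // @JsonBackReference, the property is ignored on deserialization too.
    @JsonIgnore
    public Person person;
}
```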

4. Using cascade type REMOVE or ALL for many-to-many relationships.

It is wrong to use CascadeType.REMOVE or CascadeType.ALL (which cascades all operations, including remove) on a many-to-many association. Deleting one instance would then trigger removal of all records associated with it, even though those child records may also be associated with other parent instances.
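A safer sketch cascades only the operations that make sense to share across parents (entity names follow the next example):

```java
import java.util.HashSet;
import java.util.Set;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.ManyToMany;

@Entity
public class Person {

    @Id
    private Long id;

    // PERSIST and MERGE propagate saves to the addresses, but removing a
    // Person never deletes Address rows that other Persons may still reference.
    @ManyToMany(cascade = { CascadeType.PERSIST, CascadeType.MERGE })
    private Set<Address> addressList = new HashSet<>();
}
```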

5. Defining many-to-many relationships as a List.

It is a crucial mistake to model the many-to-many side of a relationship with the wrong data type. If you declare the association as a List, on every operation (insertion or deletion) Hibernate internally deletes all existing records of that association and re-inserts a new one for each managed relationship.

@Entity
public class Person {
     
    @ManyToMany
    @JoinTable(name = "person_address",
                joinColumns = { @JoinColumn(name = "fk_person") },
                inverseJoinColumns = { @JoinColumn(name = "fk_address") })
    private List<Address> addressList = new ArrayList<Address>();
}

When I insert or delete an Address, Hibernate deletes all records associated with the given Person from the person_address table and re-inserts them along with the new one, which is completely unnecessary:

Hibernate: select person0_.id as id1_1_0_, person0_.title as title2_1_0_, person0_.version as version3_1_0_ from person person0_ where person0_.id=?
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select addressList0_.fk_person as fk_person1_2_0_, addressList0_.fk_address as fk_autho2_2_0_, address1_.id as id1_0_1_, address1_.firstName as firstNam2_0_1_, address1_.lastName as lastName3_0_1_, address1_.version as version4_0_1_ from person_address addressList0_ inner join address address1_ on addressList0_.fk_address=address1_.id where addressList0_.fk_person=?
Hibernate: insert into address (firstName, lastName, version, id) values (?, ?, ?, ?)
Hibernate: update person set title=?, version=? where id=? and version=?
Hibernate: delete from person_address where fk_person=?
Hibernate: insert into person_address (fk_person, fk_address) values (?, ?)
Hibernate: insert into person_address (fk_person, fk_address) values (?, ?)

If I use a Set instead of a List and carry out the same operation, Hibernate just inserts the new record into the person_address join table:

Hibernate: select person0_.id as id1_1_0_, person0_.title as title2_1_0_, person0_.version as version3_1_0_ from person person0_ where person0_.id=?
Hibernate: select nextval ('hibernate_sequence')
Hibernate: select addressList0_.fk_person as fk_person1_2_0_, addressList0_.fk_address as fk_autho2_2_0_, address1_.id as id1_0_1_, address1_.firstName as firstNam2_0_1_, address1_.lastName as lastName3_0_1_, address1_.version as version4_0_1_ from person_address addressList0_ inner join address address1_ on addressList0_.fk_address=address1_.id where addressList0_.fk_person=?
Hibernate: insert into address (firstName, lastName, version, id) values (?, ?, ?, ?)
Hibernate: update person set title=?, version=? where id=? and version=?
Hibernate: insert into person_address (fk_person, fk_address) values (?, ?)
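The corrected mapping only changes the collection type:

```java
import java.util.HashSet;
import java.util.Set;
import javax.persistence.Entity;
import javax.persistence.JoinColumn;
import javax.persistence.JoinTable;
import javax.persistence.ManyToMany;

@Entity
public class Person {

    // A Set lets Hibernate add or remove individual join-table rows instead
    // of deleting and re-inserting the whole association.
    @ManyToMany
    @JoinTable(name = "person_address",
               joinColumns = { @JoinColumn(name = "fk_person") },
               inverseJoinColumns = { @JoinColumn(name = "fk_address") })
    private Set<Address> addressList = new HashSet<>();
}
```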

6. Setting the wrong value for hibernate.hbm2ddl.auto.

The property hibernate.hbm2ddl.auto customizes the DDL generation of your project during deployment. The possible values are:

validate: validates the existing schema, making no changes to the database

create: drops any existing schema and creates a new one from the entity mappings

update: updates the existing schema

create-drop: creates the schema on startup and drops it again on shutdown

It is best to use validate in production environments, since one should avoid relying on Hibernate for critical database operations. The other values can result in data loss, and even a small typing error may break your application. Use migration tools for schema creation and updates, and then just check your schema with the validate property.
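For example, in a plain hibernate.properties file (the equivalent key in Spring Boot is spring.jpa.hibernate.ddl-auto):

```properties
# Validate the mapped entities against the existing schema on startup;
# never create, update, or drop anything.
hibernate.hbm2ddl.auto=validate
```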

7. Trying to save an associated object that was fetched from a cache or is outside the current transaction.

If you retrieve an associated object from a cache, assign it to a parent object, and then save the parent, Hibernate fails with the error "detached entity passed to persist". The cause is that Hibernate maintains a persistence context for each transaction, and any entity it does not find in the current transactional context is considered detached. Since we fetched the entity from a cache rather than the database, Hibernate does not see it as part of the current transaction, hence the error. To avoid this, we can use the merge operation provided by the EntityManager.
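A sketch of the merge approach (the service class and method names here are illustrative, not from the examples above):

```java
import javax.persistence.EntityManager;

public class PersonService {

    private final EntityManager entityManager;

    public PersonService(EntityManager entityManager) {
        this.entityManager = entityManager;
    }

    // merge() re-attaches the detached Address to the current persistence
    // context and returns a managed copy; use that copy, not the original.
    public void save(Person person, Address cachedAddress) {
        Address managed = entityManager.merge(cachedAddress);
        person.addressList.add(managed);
        entityManager.persist(person);
    }
}
```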

As you can see, these are small mistakes, but they can have a huge impact on performance and lead to unexpected behaviour. Watching out for them during development lets you easily avoid some of the most common pitfalls of working with Hibernate.



Also published on Medium.
