Persistence with Hibernate/JPA where object identity is important


I am using Hibernate/JPA as the persistence backend to what essentially boils down to a mod to a game written in Java.

In this context, it is very important that I query the database as rarely as possible on the main thread. Doing so asynchronously, while possible, would be impractical, as I would have to call methods of game objects from other threads, which more often than not does not work. This means I have to do as much as possible in memory using cached objects, to maximize performance (working with memory is far faster than waiting for a query to return results from the database).

Say I have entities defined as follows:

@Entity
class Town {

    @Id
    @Column(name = "id", updatable = false, nullable = false)
    private Long id;

    @OneToMany(mappedBy = "town", fetch = FetchType.EAGER) // eager fetching to avoid querying the database later for this
    private Set<Resident> residents;

    // ... other fields and associated getters/setters
}

@Entity
class Resident {

    @Id
    @Column(name = "id", updatable = false, nullable = false)
    private Long id;

    @ManyToOne(fetch = FetchType.EAGER) // eager fetching to avoid querying the database later for this
    @JoinColumn(name = "town_id")
    private Town town;

    // ... other fields and associated getters/setters
}

My question is the following:

If I were to retrieve all Resident entities using Hibernate and store them in memory (say, in a HashMap), and if I were then to retrieve all Town entities using Hibernate and cache them the same way, will calling Town#getResidents() return references to some of the same objects in memory that are present in the Resident cache?

Essentially, does Hibernate re-use still-valid objects which have previously been returned in queries to populate newly created collections?

I would also welcome any criticism of my general approach, or advice on how I could improve it. Thank you in advance! :)


Caching is a really complex topic. You should not have to take care of caching yourself; that's what Hibernate's second-level cache is for.

One of the advantages of database abstraction layers such as ORM (object-relational mapping) frameworks is their ability to transparently cache data retrieved from the underlying store. This helps eliminate database-access costs for frequently accessed data.

You still have to configure your entities to be cacheable and tell Hibernate how aggressively to cache them, but the rest is handled by Hibernate:

@Entity
@Cacheable
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
class Resident {
    // ...
}
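The second-level cache also has to be switched on globally. A minimal configuration sketch, assuming JCache as the caching SPI (the provider choice and the `ENABLE_SELECTIVE` mode are assumptions, not from the question):

```properties
# hibernate.properties (or equivalent persistence.xml properties)
hibernate.cache.use_second_level_cache=true
hibernate.cache.region.factory_class=jcache
javax.persistence.sharedCache.mode=ENABLE_SELECTIVE
```

With `ENABLE_SELECTIVE`, only entities explicitly annotated with `@Cacheable` end up in the second-level cache.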


If heap consumption is not a problem, or there are not that many instances produced, your approach is not bad. I see you're already using FetchType.EAGER; that was the important part.

I'd say you don't even need to retrieve the Resident entities separately: you can just collect the Set<Resident> residents of each Town.

Once all the instances have been retrieved, I'd also explicitly detach them with EntityManager#detach.
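A sketch of that collection step in plain Java. Town and Resident here are minimal stand-ins for the entities, and the detach call is left as a comment since it needs a live EntityManager:

```java
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

class Town {
    long id;
    Set<Resident> residents = new HashSet<>();
}

class Resident {
    long id;
    Town town;
}

public class CacheBuilder {
    // Build a by-id resident cache from already-loaded towns,
    // without issuing a second query just for residents.
    static Map<Long, Resident> residentCache(Collection<Town> towns) {
        return towns.stream()
                .flatMap(t -> t.residents.stream())
                // .peek(em::detach) // detach each entity once cached
                .collect(Collectors.toMap(r -> r.id, r -> r));
    }

    public static void main(String[] args) {
        Town t = new Town();
        Resident r = new Resident();
        r.id = 1L;
        r.town = t;
        t.residents.add(r);

        Map<Long, Resident> cache = residentCache(List.of(t));
        // The cached reference is the same object the town's set holds.
        System.out.println(cache.get(1L) == r);
    }
}
```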

And yes, Hibernate maintains multiple levels of caching. See the documentation.
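The first of those levels, the persistence context (first-level cache), is what gives the identity guarantee the question asks about: within one session, each database row maps to exactly one managed object. Roughly, it behaves like this toy identity map (illustrative names only, not Hibernate's actual API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Toy first-level cache: one instance per (class, id) pair per "session".
class ToySession {
    private final Map<String, Object> identityMap = new HashMap<>();

    // Returns the already-managed instance if this id was seen before,
    // otherwise "loads" one via the supplied factory and remembers it.
    <T> T find(Class<T> type, long id, Function<Long, T> loader) {
        String key = type.getName() + "#" + id;
        return type.cast(identityMap.computeIfAbsent(key, k -> loader.apply(id)));
    }
}

public class IdentityDemo {
    static class Resident {
        final long id;
        Resident(long id) { this.id = id; }
    }

    public static void main(String[] args) {
        ToySession session = new ToySession();
        Resident first  = session.find(Resident.class, 42L, Resident::new);
        Resident second = session.find(Resident.class, 42L, Resident::new);
        // Same row, same session -> same object reference.
        System.out.println(first == second);
    }
}
```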

If I may ask, why are you using JPA? Wouldn't a more low-level approach, maybe using MyBatis, be a better fit after all? Isn't relying on a heavyweight framework such as Hibernate overkill?


I disagree with the accepted answer with regard to caching. I have another answer where I explain in detail why I dislike Hibernate second-level caching: hibernate second level cache with Redis -will it improve performance? Using the Hibernate second-level cache is by far not a common caching strategy. There are several reasons for this:

  • The Hibernate second-level cache is very inefficient. It uses default Java serialization, which is terribly slow and terribly memory-inefficient.
  • When using the Hibernate second-level cache, you very often need to maintain the consistency of your relationships yourself. One such example is when you need to remove an element from a collection. Maintaining consistency is not a big deal in general if you use simple POJOs, but once you start mixing your persistence logic with your caching, it gets really annoying.
  • If you decide to move from a pure second-level cache to a distributed one with Hibernate, the complexity will skyrocket, and not in a good way; then you will learn the hard way why Hibernate caching is inefficient.

Quite the opposite of the accepted answer, I would advise you to separate your caching from your persistence using simple POJOs, and to manage the caching through those POJOs.
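A sketch of what that separation could look like: immutable, persistence-free view objects populated from the entities once, then served from a plain map. The class and method names here are my own, not from any framework:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Immutable cache-side view of a resident, decoupled from the JPA entity.
final class ResidentView {
    final long id;
    final long townId;
    final String townName;

    ResidentView(long id, long townId, String townName) {
        this.id = id;
        this.townId = townId;
        this.townName = townName;
    }
}

// Plain in-memory cache with no ties to the persistence layer.
final class ResidentCache {
    private final Map<Long, ResidentView> byId = new ConcurrentHashMap<>();

    void put(ResidentView view) { byId.put(view.id, view); }

    ResidentView get(long id) { return byId.get(id); }

    void evict(long id) { byId.remove(id); }
}

public class PojoCacheDemo {
    public static void main(String[] args) {
        ResidentCache cache = new ResidentCache();
        // In real code this view would be mapped from the JPA entities
        // right after loading; here it is constructed by hand.
        cache.put(new ResidentView(1L, 10L, "Rivertown"));
        ResidentView v = cache.get(1L);
        System.out.println(v.townName);
    }
}
```

Because the views carry no entity references, evicting or updating a cache entry never has to touch Hibernate's collection bookkeeping.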

Now, with respect to your model: I don't know what functionality you are covering, but I strongly doubt anyone will ever fetch a town with all of its residents. I would advise you to remove the @OneToMany relationship from Town to Resident. Based on this, I see the following scenarios:

  • Resident-centric data processing with repeated hits on the same resident: you may decide to cache the complete resident plus its town, or alternatively, if you don't hit the same resident repeatedly, cache only the town.
  • Cache both the town and the resident in the same region, with resident-centric processing: you have the option to cache both the Town and the Resident together under a single key (the resident's). You will sacrifice some memory, yes, but you get a direct hit on both resident and town in one go.
  • Two cache regions, one for Resident and one for Town: then you need to perform two lookups per resident. This is more efficient in terms of memory, but not really in terms of performance.
  • Cache only the town.

In any case, no matter what you decide, I would personally not go with Hibernate second-level caching :)
Posted 2019-02-18 20:20

