My first computer had 64MB of RAM. Since then, technology has improved and become a lot cheaper; today we can easily get terabytes of RAM.
But when I talk about storing TBs of data in a single Java application/process,
I get reactions like: are you insane?! TBs of data in a Java application? It won't even start, and if it gets into GC (Garbage Collection), you can go have a coffee at Starbucks and it still won't be finished.
Then I say: BigMemory is the saviour, you don't have to worry about GC any more.
But can BigMemory really store TBs of data without any performance degradation?
Here is my experiment: I tried loading ~1 TB of data onto a single JVM with BigMemory.
I loaded about 1 billion elements with a payload of around 850 bytes each, for a total of ~900GB of data. I hit the hardware limitations at that point, but we could certainly push past 1TB if more hardware were available.
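As a quick sanity check on those numbers (this is just back-of-envelope arithmetic, not part of the actual test harness):

```java
public class SizeCheck {
    public static void main(String[] args) {
        long payloadBytes = 850L;                 // approximate payload per element
        long totalBytes = 900L * 1_000_000_000L;  // ~900 GB of data
        long elements = totalBytes / payloadBytes;
        // ~900GB at ~850 bytes per element works out to roughly a billion elements
        System.out.println("Elements: " + elements);
    }
}
```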
I came across a huge box with 1TB of RAM, which made this possible. To reduce GC issues, I shrank the JVM heap to 2GB.
The test creates an Ehcache cache and loads the data into it. Here is the ehcache configuration used for the test:
<ehcache name="cacheManagerName_0" maxBytesLocalOffHeap="990g">
  <cache name="mgr_0_cache_0"
         maxEntriesLocalHeap="3000"
         overflowToOffHeap="true"/>
</ehcache>
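The load loop itself is straightforward. As a sketch, this uses a plain ConcurrentHashMap in place of the BigMemory-backed Ehcache (with the real Ehcache 2.x API the put would be cache.put(new Element(key, payload)) against the "mgr_0_cache_0" cache); the element count is scaled down so it runs anywhere:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LoadSketch {
    public static void main(String[] args) {
        // Stand-in for the off-heap cache; the loop shape is the same either way.
        Map<Long, byte[]> cache = new ConcurrentHashMap<>();
        int elements = 100_000;          // scaled down from ~1 billion for the sketch
        byte[] payload = new byte[850];  // ~850-byte payload per element
        for (long key = 0; key < elements; key++) {
            cache.put(key, payload.clone());
        }
        System.out.println("Loaded " + cache.size() + " elements");
    }
}
```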
Here is the graph of periodic (4-second) warmup throughput over time. The secondary axis of the chart shows the total data stored.
There are a few slight dips in the chart; these occur when BigMemory is expanding to store more data. The throughput stays above 200,000 txns/sec the whole time, with an average of 350,000 txns/sec.
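At that average rate, the total warmup time follows directly (the element count here is my rough estimate from the sizes above, not a figure from the original test):

```java
public class WarmupTime {
    public static void main(String[] args) {
        long elements = 1_050_000_000L;  // roughly a billion elements loaded
        long avgTps = 350_000L;          // average warmup throughput (txns/sec)
        long seconds = elements / avgTps;
        // ~3000 seconds, i.e. about 50 minutes to warm up the full data set
        System.out.println("Estimated warmup: " + seconds / 60 + " minutes");
    }
}
```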
Latencies are also a big concern for applications. Here they stay below 1 ms, with an average of 300 µs.
Okay, we have loaded a TB of data. Now what? Does it even work?
Yes, it does. The test phase performs read-write operations over the data set: it randomly selects an element, updates it, and puts it back into the cache.
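The read-write phase can be sketched like this (again with a ConcurrentHashMap standing in for the off-heap cache, and counts scaled down; with Ehcache the get/put would go through Element objects):

```java
import java.util.Map;
import java.util.Random;
import java.util.concurrent.ConcurrentHashMap;

public class ReadWriteSketch {
    public static void main(String[] args) {
        Map<Long, byte[]> cache = new ConcurrentHashMap<>();
        int elements = 10_000;
        for (long k = 0; k < elements; k++) cache.put(k, new byte[850]);

        Random rnd = new Random(42);
        for (int i = 0; i < 1_000; i++) {
            long key = rnd.nextInt(elements);  // randomly select an element
            byte[] value = cache.get(key);     // read it
            value[0]++;                        // mutate the payload
            cache.put(key, value);             // put it back into the cache
        }
        // The working set size is unchanged: every write hits an existing key
        System.out.println("Elements after read-write phase: " + cache.size());
    }
}
```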
I would say the throughput and latencies are not bad at all :)
The spikes are due to JVM GC; even with a 2GB heap we still get a few GCs, but the pause times stay under 2 seconds. So the maximum latency for the test is around 2 seconds, while the 99th percentile is around 500 µs.
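This is why percentiles matter more than the maximum: a handful of GC pauses barely move the p99. A minimal illustration with made-up latency samples (99 operations around 500 µs plus one 2-second GC outlier):

```java
import java.util.Arrays;

public class PercentileSketch {
    // Value at the given percentile of the recorded latencies (nearest-rank).
    static long percentile(long[] latencies, double p) {
        long[] sorted = latencies.clone();
        Arrays.sort(sorted);
        int idx = (int) Math.ceil(p / 100.0 * sorted.length) - 1;
        return sorted[Math.max(idx, 0)];
    }

    public static void main(String[] args) {
        long[] micros = new long[100];
        Arrays.fill(micros, 500);  // typical fast operations, in µs
        micros[99] = 2_000_000;    // one GC-induced ~2 second outlier
        System.out.println("p99 = " + percentile(micros, 99) + " µs");
        System.out.println("max = " + percentile(micros, 100) + " µs");
    }
}
```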
So if your application is slowed down by your database, or you are spending thousands of dollars maintaining one:
Get BigMemory and offload your database!
There may be concerns about searching this huge data set; Ehcache Search makes that possible.