- 3,000 entries
- the JVM shows that only 50 GB of heap is occupied (60 GB is allocated to the application)
- it takes approximately 20 minutes to complete the entire test plan
There was 10 GB of memory left available and unused, so I wanted to maximize usage. After doing some math (3,000 entries / 50 GB = 60 entries per GB), I increased the number of entries to 3,500 (just to leave a little leeway). Strangely, whenever I did this, even when increasing the count to just 3,001, I would have to wait over an hour for the test plan to complete. The results did not look good either: I was getting a relatively high percentage of errors.
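To spell out the math: at 60 entries per GB, the full 60 GB should hold about 60 × 60 = 3,600 entries, so 3,500 leaves roughly 100 entries of headroom.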
Since I am using JMeter for this performance test, I used the SampleResult class to configure my data collection, setting jmeter.save.saveservice.samplerData=true in bin/jmeter.properties (see the snippet below). I used a "View Results Tree" listener to inspect the collected data. As the test ran, I noticed that once it reached the GET transactions, it failed immediately on the very first one. The error was a validation error exception.
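Here is that setting in context, with a comment on what it does:

```properties
# bin/jmeter.properties
# Save the sampler (request) data with each sample result, so the
# request behind each failure is visible in the View Results Tree listener
jmeter.save.saveservice.samplerData=true
```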
I then looked at my cache configuration XML file, and there I finally found the cause of this issue! I had set
maxEntriesLocalHeap=30,000,000
, which is fine for 3,000 entries (of size 10,000 each: 3,000 × 10,000 = 30,000,000, exactly at the limit). For any added entry, the cache isn't big enough, so the memoryStoreEvictionPolicy (mine was set to LRU) kicks in and evicts old entries (I presume that later GET transactions would actually succeed, since the most recently added entries never get evicted). This was the cause of the issue! I increased maxEntriesLocalHeap, and that solved the problem! Ta-da!
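For completeness, here is a sketch of what the corrected cache element looks like (Ehcache 2.x syntax; the cache name is a placeholder, and the new limit is sized for 3,500 entries × 10,000 = 35,000,000):

```xml
<!-- "entryCache" is a placeholder name; 35,000,000 = 3,500 entries x 10,000 -->
<cache name="entryCache"
       maxEntriesLocalHeap="35000000"
       memoryStoreEvictionPolicy="LRU"/>
```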