
a 1.5GB string – BackSlasher

2023-04-08 14:26:55

In my previous position, I supported a Java service that worked similarly to RDP or Citrix, enabling remote UI functionality. This service relied on sessions, which consisted of interconnected Java objects that were supposed to be cleaned up either when a user logged out or after a predetermined timeout interval.

During our capacity planning, we found a big memory waste that I wanted to share with you.

Capacity Planning

Part of my routine work with the team included capacity planning for the next year.
By analyzing our usage metrics, growth patterns, and population research, our data scientists were able to predict how many users we could expect to have in the coming year.

To determine the infrastructure required to support this anticipated user base, we employed a sophisticated formula:

\[\text{Number of Servers} = \frac{\text{Number of Users}}{\text{Users per Server}} \times \text{Safety Buffer}\]

to know how many servers we need to have for next year.
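As a sketch with made-up numbers (the user count, per-server capacity, and buffer below are illustrative, not our real figures), the formula translates to:

```java
public class CapacityPlan {
    public static int serversNeeded(int expectedUsers, int usersPerServer, double safetyBuffer) {
        // Round up: a fractional server still means one more machine.
        return (int) Math.ceil((double) expectedUsers / usersPerServer * safetyBuffer);
    }

    public static void main(String[] args) {
        // e.g. 120,000 projected users, 300 users per server, 1.2x safety buffer
        System.out.println(serversNeeded(120_000, 300, 1.2)); // prints 480
    }
}
```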

One of our capacity planning sessions revealed that, due to the immense popularity of our service, we were anticipating significant growth in the number of users in the coming year. Our calculations indicated that we would require more servers than we had available to accommodate this increased demand. Consequently, we were faced with the challenge of figuring out how to fit more users onto each individual server in order to support the projected user base.

What are we bound on?

With capacity measurement, we can pinpoint the bottleneck in our system, and in this case, it's the memory. As more users are added to the server, the system begins to falter under the increased load, eventually running out of memory. Knowing we're memory-bound is crucial, as it directs our efforts towards reducing memory consumption in order to accommodate more users on the server.

Investigating memory usage

We had a crude estimation of our per-user memory consumption using this:

\[\text{Per User Memory} = \frac{\text{Server Memory}}{\text{User Capacity}}\]

Using imaginary numbers, we can say something like:

\[\text{Per User Memory} = \frac{\text{90 GB}}{300} = \text{300 MB}\]

So we can approximate the per-user memory requirement as 300MB.
In order to understand how to reduce this number, we went into more serious memory measurement.

We began analyzing the Java memory dumps of our servers to identify potential areas for improvement. Initially, we reviewed the dumps manually, but due to the sheer number of servers, we developed a custom script to automate the process. Using this script, we were able to identify memory-wasting objects attributed to specific sessions. By pinpointing these issues, we could effectively eliminate the waste and optimize our system's memory usage.

I'll cover the script and analysis in another post, but for now I want to focus on a particular quick win the memory analysis gave us.

A very large string

We started by going over our thousands of memdumps, looking for very large objects. Our biggest whale was a 1.5GB string. It looked something like this:

[Image: the string in a viewer — an enormous run of backslashes]

In case the image didn't convey the message: the string contained many, many backslashes. We found similar smaller ones, but this one was the biggest.

Investigating what the purpose of the string was, I saw that we had classes that looked like this:

class Screen {
  private Screen previous;

  public String toJson() {
    JSONObject jo = new JSONObject();
    if (previous != null) {
      jo.put("previous", previous.toJson());
    }
    return jo.toString();
  }
}

class Session {
  String currentScreen;

  public void setUrl(Screen s) {
    currentScreen = s.toJson();
  }
}

So each screen holds the previous screen the user visited, to allow the user to go "back" and get the exact screen they were on before (state, scrolling position, validation notices etc.). The user session also holds the current screen the user is in, so if the user reconnects to an existing session, we can return them to where they were.


There are two design problems here:

  1. The "back" stack is unlimited, meaning we keep saving more and more state until we explode
  2. By running jo.put("previous", previous.toJson());, we're converting the JSON dictionary to a string. Since JSON fields have quotes, and those quotes need to be escaped when stored in a string, they're stored as \".
    That backslash needs to be escaped when this string is stored inside another string, compounding into \\\". A couple more rounds of this, and we get \\\\\\\"
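A minimal sketch of the compounding (using plain string escaping rather than the real JSONObject, purely to illustrate the growth) shows how each nesting level roughly doubles the backslashes:

```java
public class EscapeGrowth {
    // Escape a string the way it would appear inside a JSON string literal.
    static String jsonEscape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    public static void main(String[] args) {
        String s = "\""; // a single double-quote character, as in any JSON field
        for (int level = 1; level <= 4; level++) {
            s = jsonEscape(s);
            System.out.println("level " + level + ": " + (s.length() - 1) + " backslashes");
        }
        // Prints 1, 3, 7, 15 backslashes: roughly doubling per nesting level,
        // which is how a deep "previous" chain balloons into gigabytes.
    }
}
```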

It turns out that a user with a session with lots of screens produced a currentScreen String of gigantic proportions.

Handling and followup

We divided the problem into a quick fix and a long-term one:

The quick fix was truncating the "previous" string if it goes over a specific character count (e.g. not letting it go over 100MB).
While this isn't a complete solution and may impact the user experience, it was very quick to implement and easy to test, boosting our reliability (preventing a specific session from inflating and bringing the server down).
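A sketch of what such a truncation could look like (the method name and threshold handling here are my own illustration, not the real code):

```java
public class Truncation {
    // Drop the serialized "previous" chain once it grows past a limit.
    // The real threshold (on the order of 100MB worth of chars) would be config-driven.
    static String truncatePrevious(String previousJson, int maxChars) {
        if (previousJson.length() > maxChars) {
            // Losing deep "back" history is acceptable; losing the whole server is not.
            return "{}";
        }
        return previousJson;
    }
}
```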

The long-term fix was rewriting the "previous" stack solution entirely, creating a dedicated real stack with self-imposed size limits and reporting.
It took a long time to write, and longer to test and slowly roll out, but it actually prevented memory waste, rather than only hiding whale-strings as another form of memory (e.g. very deep JSON objects).
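The long-term approach could look something like this bounded stack (a sketch under my own assumptions; the real implementation and its reporting hooks were more involved):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// A "back" stack that evicts the oldest screens instead of growing forever.
public class BoundedBackStack<T> {
    private final Deque<T> stack = new ArrayDeque<>();
    private final int maxDepth;

    public BoundedBackStack(int maxDepth) {
        this.maxDepth = maxDepth;
    }

    public void push(T screen) {
        if (stack.size() == maxDepth) {
            stack.removeLast(); // drop the oldest entry; a good place to report/log
        }
        stack.addFirst(screen);
    }

    public T pop() {
        return stack.pollFirst(); // null when there is no "back" to go to
    }

    public int depth() {
        return stack.size();
    }
}
```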


We continued to use the memory-dump analysis tool and found more nonsense we killed, but nothing as easy as this.

My main takeaway from this story is that sometimes, checking the details of how your program uses resources (e.g. analyzing a memdump rather than just measuring overall memory usage) is crucial for success and produces quick wins from the start.
