OutOfMemoryError: Failed to allocate when paging through responses


I’m receiving the following OutOfMemoryError exception when trying to page through responses from an API:

Caused by: java.lang.OutOfMemoryError: Failed to allocate a 37748744 byte allocation with 25165824 free bytes and 24MB until OOM, target footprint 268261936, growth limit 268435456
    at java.util.Arrays.copyOf(Arrays.java:3257)
    at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
    at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
    at java.lang.StringBuilder.append(StringBuilder.java:203)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.readAll(INatCall.java:105)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:56)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:75)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:75)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.restCalliNat(INatCall.java:75)
    at com.richmondgamesstudio.inattaxonomytreeviewer.utils.INatCall.doInBackground(INatCall.java:42)

Is there a more effective way of handling a large data set over REST? My app calls an API that returns a user’s ‘entries’. The API’s maximum page size is 200 entries. Most users have more than 200 entries, and this has been fine for the most part: my app can page through and account for that. However, some users have 2000+ entries, and my app runs out of memory trying to iterate through these larger sets. My questions are:

  • Is there a way to increase the amount of memory my android app can use?
  • What are some ways that I can optimize the below code to use less memory?
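On the first question: an Android app can request a larger heap by setting android:largeHeap="true" on the <application> element in AndroidManifest.xml. The system may honour or ignore the request depending on the device, and it only postpones the OOM rather than fixing the underlying growth, so treat it as a stopgap. A minimal fragment (the android:label attribute is just a placeholder):

```xml
<!-- AndroidManifest.xml: ask for the device's large heap (not guaranteed) -->
<application
    android:label="@string/app_name"
    android:largeHeap="true">
</application>
```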

Recursive REST call

private JSONArray restCall(URL url, JSONArray results) {
    Log.d(TAG, "restCalliNat: Start");
    InputStream is = null;
    int totalResults = 0;
    int perPage = 0;
    JSONArray newResults = new JSONArray();
    try {
        is = url.openStream();
        BufferedReader rd = new BufferedReader(new InputStreamReader(is, Charset.forName("UTF-8")));
        String jsonText = readAll(rd); //<-- LINE 56
        JSONObject json = new JSONObject(jsonText);
        newResults = json.getJSONArray("results");
        totalResults = (int) json.get("total_results");
        perPage = (int) json.get("per_page");
    } catch (IOException | JSONException e) {
        return null;
    }
    if (results != null) {
        newResults = concatArray(results, newResults);
    }
    // 'page' is a field of the enclosing class, advanced by updatePageCount()
    if (totalResults > page * perPage) {
        newResults = restCall(updatePageCount(url), newResults);
    }
    return newResults;
}

At each new page, the new page gets concatenated until I have all the entries.
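Two memory costs hide in that design: the recursion keeps every stack frame (and its intermediate JSONArray) alive until the last page returns, and concatArray re-copies the entire accumulated array on every page, so total copying grows quadratically with the number of pages. A loop that appends into one pre-sized accumulator avoids both. A minimal sketch, with a hypothetical fetchPage stub standing in for the real network call:

```java
import java.util.ArrayList;
import java.util.List;

public class Pager {
    static final int PER_PAGE = 200;

    // Hypothetical stand-in for the real HTTP call: returns one page of entries.
    static List<String> fetchPage(int page, int totalResults) {
        List<String> entries = new ArrayList<>();
        int start = (page - 1) * PER_PAGE;
        for (int i = start; i < Math.min(start + PER_PAGE, totalResults); i++) {
            entries.add("entry-" + i);
        }
        return entries;
    }

    // Iterative paging: only the current page plus the accumulator are live,
    // and the accumulator is sized once up front, so it never reallocates.
    static List<String> fetchAll(int totalResults) {
        List<String> all = new ArrayList<>(totalResults);
        int page = 1;
        while ((page - 1) * PER_PAGE < totalResults) {
            all.addAll(fetchPage(page, totalResults));
            page++;
        }
        return all;
    }
}
```

With the real API, totalResults would come from the first response’s total_results field rather than being passed in.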

private JSONArray concatArray(JSONArray arr1, JSONArray arr2) {
    JSONArray result = new JSONArray();
    try {
        // Copy every element of both arrays into the new array
        for (int i = 0; i < arr1.length(); i++) {
            result.put(arr1.get(i));
        }
        for (int i = 0; i < arr2.length(); i++) {
            result.put(arr2.get(i));
        }
    } catch (JSONException e) {
        e.printStackTrace();
    }
    return result;
}
Converts the API response to a String

private static String readAll(Reader rd) throws IOException {
    StringBuilder sb = new StringBuilder();
    int cp;
    while ((cp = rd.read()) != -1) {
        sb.append((char) cp); //<--- LINE 105
    }
    return sb.toString();
}
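Appending one char at a time makes the StringBuilder repeatedly double its backing array, and each doubling of a ~36 MB array briefly needs the old and new copies live at once; that Arrays.copyOf is exactly the allocation in the stack trace. A sketch that appends in 8 KB chunks and, when the response size is known up front (e.g. from a Content-Length header, which is an assumption about the API), pre-sizes the builder so it never grows mid-read:

```java
import java.io.IOException;
import java.io.Reader;

public class ResponseReader {
    // Reads the whole stream in 8 KB chunks; expectedChars (if > 0) pre-sizes
    // the builder so it never has to grow and copy while reading.
    static String readAll(Reader rd, int expectedChars) throws IOException {
        StringBuilder sb = new StringBuilder(expectedChars > 0 ? expectedChars : 8192);
        char[] buf = new char[8192];
        int n;
        while ((n = rd.read(buf)) != -1) {
            sb.append(buf, 0, n);
        }
        return sb.toString();
    }
}
```

The larger win on Android would be skipping the intermediate String entirely and streaming the response through android.util.JsonReader, but the buffered version above is a drop-in replacement for the method as written.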


Turns out the authority I was calling limits how many pages you can request.

The per_page limit was 200 results per page, but there was also an overall cap: any request where page * per_page > 1000 was cancelled, even in the middle of a stream. So beyond the first 1000 results it didn’t matter what per_page was set to; the call was cancelled either way.
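Given that cap, the client can stop paging before the server cancels the call mid-stream. A minimal guard, assuming the cap is exactly page * per_page > 1000 (the method name is illustrative):

```java
public class PageGuard {
    // The API cancels any request where page * per_page exceeds 1000,
    // so with per_page = 200 only pages 1-5 are retrievable.
    static boolean pageAllowed(int page, int perPage) {
        return page * perPage <= 1000;
    }
}
```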

Answered By – JonR85
