Sublime Forum

Extremely slow when handling large data


I would like to ask if anyone else is facing this issue: it is extremely slow to open / view / edit / save large text files (JSON, SQL, CSV of 1 GB and above). I tried opening a 2 GB JSON file and it took nearly 4 hours to open completely, and once the file was finally open, Sublime crashed.

FYI, I am using Sublime Text 3 build 3103, but this problem has existed since Sublime Text 2.



First, all editors struggle with large files. Try editing a 2 GB text file in Notepad!

Edit: Most editors struggle with large files unless they have been carefully designed for them.

I often open big text files in Sublime (> 1 GB) and it handles them reasonably well.

But I usually don’t open big source files. When opening your JSON file, Sublime will parse it for highlighting, which can take a long time.
Also, if you have line wrapping enabled, it will take even longer.
And saving is slow simply because Sublime has to write a 2 GB file to your disk.
If Sublime crashed, my guess is that you ran out of RAM. Loading a 2 GB file takes at least 2 GB of RAM in my experience.

Is there any way to automatically disable some options, like syntax highlighting and line wrapping, on big files?
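There doesn’t seem to be a built-in option, but a small plugin could probably do it. A minimal sketch of the decision logic, with an assumed threshold (in an actual Sublime plugin you’d call this from an `on_load` event listener and apply the result via `view.settings().set(...)`):

```python
# Hypothetical cutoff above which heavy features get disabled.
LARGE_FILE_THRESHOLD = 50 * 1024 * 1024  # 50 MB; pick to taste

def settings_for_size(size_bytes, threshold=LARGE_FILE_THRESHOLD):
    """Return the view settings a plugin would apply for a file of this size."""
    if size_bytes < threshold:
        return {}  # small file: leave everything on
    # Big file: turn off wrapping and fall back to plain-text scoping.
    return {
        "word_wrap": False,
        "syntax": "Packages/Text/Plain text.tmLanguage",
    }
```

The threshold would ideally be user-configurable, since a “large” file on one machine is an ordinary one on another.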

Also, it would be nice to be able to open a big file but only load parts of it. (Maybe there is already a plugin for that; I never looked.)
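Reading just a slice of a huge file doesn’t require loading the whole thing; plain file seeking already does it. A minimal sketch (an editor that demand-loads essentially does this per screenful):

```python
def read_window(path, offset, length):
    """Read only `length` bytes starting at `offset`; the rest of the
    file is never loaded into memory."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)
```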

I have also noticed that selecting every ‘foo’ and typing ‘bar’ is much slower than doing a search-and-replace through the menu.



I see… since you mentioned disks and RAM, would they still be a factor given that I am using:

Intel Core i7 6700K
2x Samsung 950 512GB NVMe SSDs in RAID 0

This setup is used solely for simulation purposes and data manipulation…



That’s a pretty good setup! It seems unlikely that you ran out of RAM with 64 GB, but you might want to monitor your RAM usage while opening your JSON file.

Is your JSON valid?
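You can check that without opening the file in an editor at all. A quick sketch (note that `json.load` reads the whole file into memory, so this is only practical if you have the RAM to spare; the path is a placeholder):

```python
import json

def is_valid_json(path):
    """Return True if the file parses as JSON, False otherwise."""
    try:
        with open(path, "r", encoding="utf-8") as f:
            json.load(f)
        return True
    except ValueError:
        return False
```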

And even with an SSD, saving a large file will still take time. How long does it take to copy a big file, versus modifying and saving it in Sublime?
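A raw copy gives you the baseline: roughly the disk-write cost any editor’s save has to pay. A small sketch for timing it (paths are placeholders; substitute your real file):

```python
import shutil
import time

def timed_copy(src, dst):
    """Copy src to dst and return the elapsed time in seconds."""
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    return time.perf_counter() - start
```

If Sublime’s save takes much longer than this on the same file, the overhead is in the editor, not the disk.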



I always edit large text and CSV files in my old v8 of UltraEdit. Sublime’s performance is much worse in my experience with GB-sized files.



The only editor besides Vim I’ve come across that opens massive text docs with ease is BBEdit.



A quick ‘hi’ as I’ve been off the map for a little while. Returning to the topic at hand (I’ve commented on this a few times over the years): Sublime’s performance with either very large files (1 GB+) or files with very long lines (10k+ characters) leaves a lot to be desired. Generally I avoid using Sublime for such tasks and fall back to Crisp or Vim, which handle pretty much anything with lightning performance.

@gwenzek I respectfully disagree with the idea that editors always struggle with huge files. Those whose engines have been coded to handle them do so with ease, and there are numerous examples. Crisp, for instance, demand-loads such files, so editing an 8 GB file is no problem on a machine with 2 GB of RAM.

Returning to Sublime: setting the syntax to “Plain Text” often helps, since this effectively disables the heavy-duty regexes required for scoping/colouring source. Disabling wrap and ‘highlight matches’ on S&R helps too.

Sublime could do with some improvements to its editing engine, along with some adaptive intelligence around handling large files and anything else that might block performance. Why? I’ve maintained before that Sublime is a tough sell if it can’t be a “swiss army knife” editor for programmers and hackers. One job we all need to do from time to time is work with huge files (usually log files, in my case), and it’s a shame I can’t stay in Sublime’s environment (with its plentiful sugar) while doing so.

By “adaptive intelligence” I mean things like offering an abort whenever Sublime does something that blocks the UI for a while - whether that’s opening huge files or running plugins that end up doing more work than intended - and changing the editing engine’s setup when working with huge files: disabling wrap as mentioned above, disabling ‘highlight matches’ for S&R, disabling colourisers, and so on. Since the sensible thresholds vary with hardware, it would be nice to have them configurable. Such improvements would make Sublime much more robust with large files, avoiding the ‘doh’ moments when you do an S&R on a big file and the whole UI locks up while it tries to highlight every match in the file.

Bottom line: Sublime isn’t great with big files. If you’re going to edit one anyway, turn off wrap, set the syntax to plain text, and turn off ‘highlight matches’ in S&R dialogs. I’d love to see some work in this area; I know it’s a chunk of work, but it would secure Sublime as the clear choice for programmers and hackers who expect high-performance, powerful editing across all text file types and sizes.