Sublime Forum

All Line Endings are converted on file save

#21

Then, once a file is detected to contain multiple line endings, instead of forcing a conversion, ask the user to choose between converting the line endings or disabling all the features not prepared to deal with such a file.

This falls into the same category as performance with very large files. With the architecture changes required for Sublime Text to support those, it could also readily support opening files with mixed line endings.

Related threads:

  1. #9832 Sublime Text performance with very large files
0 Likes

#22

The responses in this thread are absurd and, I believe, inconsistent with standard expectations of ST. When you open a file in a "hacker" text editor like ST, you expect the editor to behave appropriately. I don't believe it's impossible for you to simply retain the line endings as they are until they are changed. If you have to literally change the data to meet your coding practices, your coding practices are incorrect. You need to fix the libraries you use to handle line endings so that you can leave the data alone.

Here's a scenario where this problem breaks things: analyzing a file for inconsistent line endings. I use ST to do a lot of things, one of which is analyzing the data structure of files. I work with really large files, so running the entire file through the hex converter is often suboptimal or even impossible. Additionally, running the entire file through the hex converter requires you to actually look through the entire hex structure of the file. So, to get around the file size and reduce the scope of my search, I open the file in ST, delete everything except the snippet of text I'm trying to look at, and then save the file as a snippet (still a text file, but smaller). I then convert the snippet to hex using the hex plugin. This has seemingly worked fine for as long as I've been using ST this way. However, I was recently analyzing a file that had inconsistent line endings. The inconsistent line endings were causing SQL Server BULK INSERT to skip rows where the line ending wasn't correct. ST obfuscated the problem by making the file look as though it had consistent line endings. I ended up chasing my tail for months because the problem wasn't visible.

Correcting the problem with SQL Server IS as simple as exporting the file with consistent line endings. Unfortunately, that requires that the problem be identified in the first place. And that is where ST failed me.
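For anyone hitting the same wall: a small standalone script can surface mixed line endings without opening the file in an editor that normalizes them, and without hex-dumping the whole file. A minimal sketch in Python (the function name and CRLF/LF/CR labels are my own, not from any ST plugin):

```python
import re
from collections import Counter

def count_line_endings(path):
    """Count CRLF, CR-only, and LF-only line endings in a file.

    Reads the file in binary mode so nothing normalizes the endings
    before we see them. The alternation tries CRLF first so a \r\n
    pair is never double-counted as a CR plus an LF.
    """
    with open(path, "rb") as f:
        data = f.read()
    labels = {b"\r\n": "CRLF", b"\r": "CR", b"\n": "LF"}
    counts = Counter()
    for match in re.finditer(rb"\r\n|\r|\n", data):
        counts[labels[match.group()]] += 1
    return counts
```

If the returned counter has more than one nonzero key, the file has mixed endings; that check would have flagged the BULK INSERT file immediately. For files too large to read at once, the same regex can be applied to fixed-size chunks, with care taken not to split a CRLF pair across a chunk boundary.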

1 Like

#23

Two things:

  1. I want to register my frustration as well that ST cannot be prevented from making more changes to a file than what I specifically requested. There are situations where multiple constraints corner someone into a position where this is unworkable.

  2. I can appreciate the technical problems @wbond detailed and I'd like to sincerely thank him for providing a reasonable answer for why this feature will never be supported. There are all kinds of trade-offs in software development, and I can understand that it may legitimately be the case that the effort to support these use cases outweighs the customers lost as a result of not supporting them.

     The official responses prior to @wbond's basically amounted to "you're wrong for needing this feature", which is less-than-helpful and even arguably not true. On the other hand, "We're not ever going to support this use case; you will need to find another tool: Closed - Won't Fix" is, in my view as a developer, a valid answer.

</$0.02>

0 Likes