Comments

Clive Robinson March 28, 2023 10:50 AM

@ Bruce, ALL,

“… have vulnerabilities that allow people to partially recover content that was edited out …”

This is effectively true not just of image editing tools, but of all editing tools with an undo/recover feature.

From the earliest line editors that “saved edits” through to modern multimedia editors, there are “undo” features of one form or another, because that’s what fallible humans think is a highly desirable feature[1].

Way more than they think “backups” are a desirable feature…

So why should the designers and developers of such tools actually “delete” rather than “not display” how things were?

After all, they know that,

“To err is human, and that means that a user’s shortcomings must, by user logic, be the designer’s failure to anticipate.”

So the designers and developers “leave it in” as much as they can…

A user who has got their work looking the way they want never stops to think about what they cannot see…

The simple fact is there are several ways to solve the “don’t leave anything behind” issue, the easiest being “print out and then photocopy/scan”.

It’s one of the reasons I always say,

“PAPER, Paper, NEVER data…”

And have been saying it since quite a ways back into the last century…

One day maybe… people will learn that their need to have their shortcomings rescued is a sharp double-edged sword that cuts both ways…

That is, it is there when they need it, and, as a consequence, when those who are against them need it…

[1] Way more desirable than making proper backups… Such is “short term thinking” and basic “laziness”…

pedro.frazao March 28, 2023 11:53 AM

Probably this is not a bug, only a misalignment of expectations.

All photography editing software does nondestructive editing. Every single photographer expects that.

If a photographer wants to share an edited photograph, he will export it explicitly to a new file. This new file will contain only the resulting image.

Clive Robinson March 28, 2023 11:57 AM

@ ALL,

The simple version of “a crop alypse” was around long before the IBM PC existed…

That is, it was well known amongst Apple ][ owners and those with 8080 CP/M machines. The same truncation-mark mistake was known to happen on VAX and *nix OSs and their predecessors back in the 1960s and 70s. Later, many programs using non-standard C libraries or buffered file writes did the same thing from the 1970s onwards[1], and it has been a “built-in feature” of other programming languages (like some implementations of FORTRAN).

When MS-DOS (a rip-off of CP/M) came along it had a “text editor” that used Ctrl-Z as its end-of-text marker. As the Debug “file and memory editor” also came with MS-DOS, you could clearly see the end-of-text marker inside the file or memory buffer, as well as the rest of the disk file or buffer end block[2].

So there is nothing in the slightest new about this “bug” of valid data surviving beyond an end marker, and you would have thought it would be “well known” to developers after around half a century[3]…
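To make that concrete, here is a minimal C sketch of the effect (the file name and the “hidden” text are purely illustrative): a CP/M-style text reader stops at the 0x1A Ctrl-Z marker, while the bytes written after it stay in the file for anything that reads it raw, Debug included.

```c
#include <stdio.h>
#include <string.h>

#define CTRL_Z 0x1A   /* CP/M / MS-DOS end-of-text marker */

int main(void) {
    /* A 128-byte CP/M-style record: a short "visible" text, the Ctrl-Z
       end-of-text marker, then leftover bytes from an earlier edit. */
    unsigned char record[128];
    memset(record, ' ', sizeof record);
    memcpy(record, "Approved.", 9);
    record[9] = CTRL_Z;
    const char *tail = "earlier draft text that was never really removed";
    memcpy(record + 10, tail, strlen(tail));

    FILE *f = fopen("note.txt", "wb");
    if (!f) return 1;
    fwrite(record, 1, sizeof record, f);   /* the whole record hits the disk */
    fclose(f);

    /* A TYPE-style text reader stops at the Ctrl-Z marker... */
    f = fopen("note.txt", "rb");
    if (!f) return 1;
    int c;
    printf("What the text reader shows: ");
    while ((c = fgetc(f)) != EOF && c != CTRL_Z)
        putchar(c);
    putchar('\n');

    /* ...but a raw dump carries on and sees everything after it. */
    printf("What a raw dump shows after the marker: ");
    while ((c = fgetc(f)) != EOF)
        putchar(c);
    putchar('\n');
    fclose(f);
    return 0;
}
```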

Yet as normal in the ICT industry, we don’t appear to learn from our history…

I guess more people should start asking why?

Especially as I know this end-marker problem still exists in many, many storage file formats, and they are not hard to find…

[1] That is, as we know, a C string can be any length that will fit in memory. The end of the string is marked by the 0x00 ASCII NUL character. So if you have a 1024-byte-long string in a buffer and you only need the first ten bytes, writing 0x00 at position 10 is a fast way to do it (i.e. the first position is 0, so 10 is actually the 11th position). However, when the buffer gets written out to disk, the fast way is to “write in blocks”, so if the storage blocks are 512 bytes, the first half of the original string gets written to storage with the only difference being the 0x00 at the 11th position.
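A minimal C sketch of that footnote, with the same 1024-byte buffer and 512-byte blocks (the strings themselves are made up): dropping a NUL into position 10 changes exactly one byte, so when the block is pushed out to storage the old tail goes with it.

```c
#include <stdio.h>
#include <string.h>

#define BLOCK 512   /* storage block size from the footnote */

int main(void) {
    char buf[1024];
    memset(buf, 0, sizeof buf);

    /* The "old" long string sitting in the buffer. */
    strcpy(buf, "0123456789 followed by a long tail that was meant to be edited away");

    /* We only want the first ten bytes, so the fast "truncation" is
       writing 0x00 at position 10 (i.e. the 11th byte). */
    buf[10] = '\0';

    /* String functions now see only the short version... */
    printf("strlen() now reports %zu: \"%s\"\n", strlen(buf), buf);

    /* ...but the block write pushes the whole first block out to disk,
       old tail and all; only the byte at position 10 has changed. */
    FILE *f = fopen("out.dat", "wb");
    if (!f) return 1;
    fwrite(buf, 1, BLOCK, f);
    fclose(f);

    /* Anyone reading out.dat past that NUL recovers the "deleted" tail. */
    return 0;
}
```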

[2] Which could be a handy place to put things out of sight, because it would only get overwritten if the file was extended far enough (as FAT-12 gave way to larger FATs, the size of disk file blocks went up to around 8192 bytes, so tucking, say, a password away at the end would probably not get overwritten in a “readme file” that was designed to be well short of a block multiple).

[3] Especially as it’s the way FAT “File Undelete” worked. That is, rather than delete a file, MS-DOS just overwrote the first character of the file name in the directory entry, so undeleting a file was simply a case of using Debug to change that first character back to a printable ASCII character and then getting the user to change it to the correct character using a DOS or other file manager “rename” option.
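A minimal, in-memory C sketch of footnote [3] (no real disk access; the 0xE5 deletion marker and the 11-byte 8.3 name field follow the FAT on-disk layout, everything else in a real directory entry is left out): “deleting” only clobbers the first byte of the name, so undelete is just a matter of putting a sensible character back, provided the file’s clusters have not been reused.

```c
#include <stdio.h>
#include <string.h>

/* A much-simplified FAT directory entry: just the 11-byte 8.3 name.
   A real entry also holds attributes, timestamps, the first cluster
   number and the file size, none of which deletion touches. */
struct dir_entry {
    unsigned char name[11];   /* e.g. "README  TXT", space padded */
};

#define DELETED_MARK 0xE5     /* first name byte of a deleted entry */

static void show(const char *what, const struct dir_entry *e) {
    printf("%-11s %.11s  (first byte 0x%02X)\n",
           what, (const char *)e->name, e->name[0]);
}

int main(void) {
    struct dir_entry e;
    memcpy(e.name, "README  TXT", 11);
    show("original:", &e);

    /* "Deleting" the file: DOS just overwrites the first name byte.
       The rest of the entry and the file's data clusters stay put. */
    e.name[0] = DELETED_MARK;
    show("deleted:", &e);

    /* "Undeleting": poke a printable character back into that byte
       (with Debug, back in the day), then rename to taste. */
    e.name[0] = 'R';
    show("undeleted:", &e);
    return 0;
}
```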

Jimbo March 28, 2023 12:15 PM

For Windows, why not open the file in Paint (which has much better editing), edit as needed, and then use Snippy to capture the edited image from the screen? Save to a new file (with a new format if needed). Snippy won’t have any bytes from the original file, so there won’t be any reverse editing.

Peter A. March 29, 2023 4:48 AM

@Clive: at least vi does not do such nonsense; it rewrites the file anew on every ZZ or :w and deletes its ‘scratchpad’ file at exit 🙂

The filesystem, however, just marks the space free and happily churns away. Undeleting on more modern and complicated filesystems is not as easy as on ancient and simplistic FAT, but it is possible – been there, done that, on some old UFS, on ext3, even on Veritas FS. The last one was a bit of a nightmare, but it saved transferring many tens of gigabytes over a busy 1 Mbps link after a strange crash that left the filesystem almost full with no files visible; fsck did not help.
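A minimal C sketch of both halves of that point (this is the general rewrite-the-file-on-save pattern, not vi’s actual code; the file name and text are illustrative): reopening with O_TRUNC and writing the new version from scratch leaves no stale tail inside the file itself, but the blocks the longer old version occupied are merely marked free by the filesystem, which is exactly what undelete-style recovery goes digging for.

```c
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

int main(void) {
    const char *path = "draft.txt";

    /* First save: a long version of the file. */
    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) return 1;
    const char *v1 = "a long first draft full of things later regretted\n";
    if (write(fd, v1, strlen(v1)) < 0) return 1;
    close(fd);

    /* Second save, editor-style: the file is rewritten from scratch.
       O_TRUNC discards the old contents, so the short new version is
       all the *file* holds: no old tail lurking past an end marker. */
    fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) return 1;
    const char *v2 = "short final text\n";
    if (write(fd, v2, strlen(v2)) < 0) return 1;
    close(fd);

    /* The catch: the disk blocks that held the long first version (and
       any deleted scratch file) are only marked free by the filesystem,
       so forensic or undelete tools may still recover them. */
    return 0;
}
```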

Nobody’s infallible, and sometimes even a daily backup schedule is not enough when you drop the rm -rf bomb on the wrong target or do some other stupid thing in a hurry.

Clive Robinson March 29, 2023 7:45 AM

@ Peter A., ALL,

Re : Nobody’s infallible

All humans err, that much is accepted by most, but as the old *nix version of the joke has it,

“It takes a computer to really ‘fsck’ things up…”

It happens increasingly because most people forget that computers are a product of mankind (a point a UK Law Lord made against British Gas, who were endlessly harassing a lady and, when challenged, used the old “the computer says” excuse).

As engineers we have spent millions of man-hours over the past half century or more trying to make the “electronics” more reliable, and found that we just ran into other problems like metastability and little gifts from space passing through. So we went in other directions, using parity, redundancy and even word puzzles from several millennia ago.

The problem now is that we’ve made things so reliable in some senses that two things have happened,

1, We’ve become overly reliant.
2, We no longer have ephemerality.

As a result society has become an almost alien place to those born before the transistor radio took over from the valve/tube radio.

We now have got to the point where “unreliability” has become a necessity for economic growth…

It’s not much talked about, but the history of light bulbs, their manufacturing, and the setting up of cartels to deliberately limit reliability to “maintain market growth” tells a lot. For instance, we’ve gone from tungsten filaments to LEDs, yet as if by magic the reliability is still around 2000 hours. Any electronics engineer can tell you that it should be around 500,000 hours… So why is it not? The simple answer is that the manufacturers design new LED bulbs to “burn out” by using parts in a way that makes them fail, so they can keep selling light bulbs…

Whilst people might be shocked by this, they don’t think about the flip side… Part of the reason we can buy light bulbs is that the cost of manufacture is sufficiently low. If reliability went up to 500,000 hours, an equivalent lifetime of ~250 years at a couple of thousand hours’ use a year, no manufacturing system running at reasonable cost to the consumer could survive, due to the crippling effect of “fixed costs” and similar…

However, in computing we have kind of gone over the reliability hump, and the reason computers are still being manufactured at all is a little strange (remember, it’s actually a deflationary market as far as basic economics is concerned).

Whilst I do use more modern OSs for what I need to do –some unique to me– I still use Win XP and Office 95 to “prettify” “text” for other people’s “expectations” (though banging it out in crude HTML in an editor works more than well enough).

Likewise I use CSV and still Microsoft’s “Rich Text Format” (RTF), even though RTF has not had any love from Microsoft or others for over a decade and a half now. I even still use the earlier “WordStar” file format, and happily chug along in WS4.x on Microsoft MS-DOS 5 on some of my more “gentrified” computers.

Which is why your mention of ‘vi’, and the fact that it does not do some of the dumb things more modern developers and programmers “build in”, is worth noting. Whilst I have used and still do use ‘ed’, I never really took to vi as a work tool, as WordStar “key-bindings” became the default in early IDEs under DOS (and can still be found in supported use).

The point is vi has survived by the “snowball down a hill” process and as such became a “standard”, and still is one. Over time many of its faults have been not just fixed but effectively expunged from nearly all “living memory”, and learning it is a “rite of passage”. What others mainly don’t realise is that it encourages a “separation” that ends up in the way people think and do things at an almost mechanical level. That is, vi tends to make you put your thoughts in order so your work flows, rather than keep pecking at it for visual style… That in turn encourages more structured thinking, which works its way up and tends to produce higher productivity in people it suits. So I suspect vi is going to still be in regular use long after both of us no longer care 😉

But the flip side of the “hand holding” that most “productivity tools” do is that they keep users in a kindergarten mode they don’t grow out of during their teens and twenties, and by the time they are in their thirties, for most, “it ain’t going to happen”.

So they don’t learn why the scissors are not really useful because of all the protection around the blades. They just accept the protection and don’t progress to “craftsmanship” or an understanding of the dangers of “edged tools”…

With computers it’s now become so hard not to have everything kept that law enforcement and the legal profession see “computers” and other “ICT tools” as the way to an easy success…

It also means that certain types of “crime” become almost ridiculously simple as well… Hence ID theft, impersonation and even scapegoating become easy for those that “know the system”, and virtually impossible to comprehend for those not “in the know”, which is by far the majority, by a very, very long way…
