Strange that at this level of hardware the HN zeitgeist views firmware replacement as a flaw†. At a higher level, say the cell phone or desktop computer, there is a sentiment for "if you can't replace its software, it isn't yours".
There seems to be a threshold under which a device ought to just do what you expect, what the manufacturer decreed it would do and no more, even if you own it and it could do more. I propose that this level varies widely across individuals.
† Granted, this capability is undocumented. But if it were documented on that scrap of paper that fell out of the packing materials, in 4pt type, using gray ink on not terribly white paper, would it be that different?
I don't see it as a flaw, but now that it's brought to light I think it deserves more investigation. I'd really like to see an open-source SD controller, maybe even one of the SD-to-raw-flash type.
But on the other hand, I think we should be enjoying the relative freedom of today (and trying to preserve it for the future). It seems too many are trying to spin "security" as something beneficial, when what they are really saying is: "we're making things secure against you and taking away your freedom so we can control what you do; it also happens to be effective against attackers, which is all we're going to promote". If this line of thinking continues, we may see devices in the future that are even more locked-down and user-hostile.
(FYI, I've worked with embedded systems for quite a while and knew SD cards had firmware in them that could be modified, but I never really investigated it; I just filed it in the back of my mind as one of those "I'm curious enough that if I had the time I'd have a go at it" things, along with several dozen others.)
Consider this: someone hands you an SD card when you need to transfer some data between devices. You think you delete the data afterwards, but the SD card has hacked firmware that merely pretends it's been deleted. You hand the SD card back, and your "benefactor" now has your data.
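To make that concrete, here's a toy model in C of how such firmware could behave. It's purely illustrative: the names (cmd_write, cmd_erase, the shadow buffer) are hypothetical, and real SD controller firmware with its flash translation layer is far more involved, but the trick is the same: acknowledge the erase while quietly keeping a copy.

```c
#include <stdio.h>
#include <string.h>

#define BLOCKS     4
#define BLOCK_SIZE 16

static unsigned char flash[BLOCKS][BLOCK_SIZE];   /* what the host sees */
static unsigned char shadow[BLOCKS][BLOCK_SIZE];  /* hidden copy kept by the firmware */

/* Host-visible write: the firmware also squirrels the data away. */
static void cmd_write(int block, const unsigned char *data) {
    memcpy(flash[block], data, BLOCK_SIZE);
    memcpy(shadow[block], data, BLOCK_SIZE);      /* covert retention */
}

/* Host-visible erase: reports success but only clears the visible copy. */
static void cmd_erase(int block) {
    memset(flash[block], 0xFF, BLOCK_SIZE);       /* looks erased to the host */
    /* shadow[block] is deliberately left intact */
}

int main(void) {
    cmd_write(0, (const unsigned char *)"secret document!");
    cmd_erase(0);
    printf("host sees erased block: %s\n", flash[0][0] == 0xFF ? "yes" : "no");
    printf("shadow still contains : %.16s\n", shadow[0]);
    return 0;
}
```

From the host's point of view the block reads back blank; only someone who can talk to the firmware's hidden state gets the data back.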
It's a flaw if people are not aware of it. Most people see things like SD cards and USB sticks as "dumb" storage devices with no real ability to run software, and are totally unaware of the risks they pose.
People (well, most reasonably aware people) understand that a cellphone is basically a small computer: it has a processor, memory, and storage, and it executes code. And it's nice to be able to update/modify that code to change how the device operates, and somewhat obnoxious when you can't. (To a limit: it's also obnoxious and dangerous when someone else can change that code without letting you know.)
In the case of SD cards, many people assume that they are "dumb" devices. They don't realize that they have a processor (microcontroller) which executes code, and that it's not functionally equivalent to a floppy / Zip disk / CD / pick-your-favorite-dumb-storage-metaphor.
I don't think this is the last time we're going to run into this issue ... an increasing number of devices have embedded, potentially-reprogrammable microcontrollers (laptop batteries, power supply bricks, headphones, to name just a few) that could be used as attack vectors, or as platforms for cool hacks.
The solution, IMO, is not to further obfuscate the programming method, but to make the code easier to inspect/validate and maybe even reflash, so that users can verify that their devices are running what they think they're running.
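As a sketch of what that inspection could look like, assuming the vendor published a reference image (or the community audited one) and the card exposed some way to dump its firmware, you could hash the dump and compare it against the known-good value. This uses OpenSSL's SHA256(); the dump file name is hypothetical:

```c
#include <stdio.h>
#include <stdlib.h>
#include <openssl/sha.h>

/* Hash a dumped firmware image so it can be compared against a
 * vendor-published (or community-audited) reference digest. */
int main(int argc, char **argv) {
    if (argc != 2) {
        fprintf(stderr, "usage: %s <firmware-dump.bin>\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    fseek(f, 0, SEEK_END);
    long len = ftell(f);
    fseek(f, 0, SEEK_SET);

    unsigned char *buf = malloc(len);
    if (!buf || fread(buf, 1, len, f) != (size_t)len) {
        fprintf(stderr, "read failed\n");
        return 1;
    }
    fclose(f);

    unsigned char digest[SHA256_DIGEST_LENGTH];
    SHA256(buf, len, digest);
    free(buf);

    for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("\n");
    return 0;
}
```

Compile with `cc -o fwhash fwhash.c -lcrypto` and compare the printed digest against the published one. Of course this only helps if the dump mechanism itself is honest, which circles right back to wanting an open controller.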