
Abstractions usually (always?) have a cost because physics.

> The damned thing even ran sendmail

and?

> Cloud computing is really still "somebody else's computer."

That's the definition of 'the cloud'. Unless you run it locally, in which case it's your computer. What's your point?

> There's no "OS" (in the philosophical sense) for treating remote resources truly abstractly

It's unclear what you're asking for. Treating remote resources truly abstractly is going to get you appalling and horribly variable scalability. If you're aware of that, why don't you tell us what you want to see instead?

Edit: ok, this is from gumby, now I recognise the name. This guy actually knows what he's talking about, so please tell us what things you would like to see implemented.



>> Cloud computing is really still "somebody else's computer."

> That's the definition of 'the cloud'. Unless you run it locally, in which case it's your computer.

Forget the stupid framing of idiotic marketers in the early '00s and go back to the original “cloud” definition (which engineers were still using in those '00s but which was distorted for a buck).

The term was introduced (by Vint Cerf perhaps) in the original Internet protocol papers, literally with a picture of a cloud with devices at the edge. It was one of the revolutionary paradigms of IP: you push a packet into the cloud (network) but don’t need to / can’t look into it, and the network worries about how to route the packet — on a per-packet basis! You don’t say “the cloud is other peoples’ routers”.

Today’s approach to remote computing requires developers to know too much about the remote environment. It’s like the bad old days of having to know the route to connect to another host.



