Tabbed isn't enough for me. I don't want an app. I want a website. I want my user agent! My user agent is what I know; it affords me URLs, extensions, and a powerful UI with lots of options.
The PWA thing seems sick to me. Henry Ford listens to market research and rebuilds his car as a mechanized horse. For me, the UX is strictly worse in every way. (And I already knew how to put links to webpages on my home screens).
There does seem to be a browser display mode, but it's up to the app maker to decide for the user what mode the app will be in. Why?!
> Progressive Web Apps can run in various display modes determined by the display property in the web app manifest. Examples are fullscreen, standalone, minimal-ui, and browser.
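For what it's worth, that mode is just one field in the manifest JSON the site ships; a minimal, hypothetical manifest that keeps users in a normal browser tab would look something like this:

```json
{
  "name": "Example App",
  "start_url": "/",
  "display": "browser"
}
```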
It makes me so sad how much lesser a person a user of a PWA is. The utter lack of user agent, being cast to whatever is provided by the maker, is a horrifying loss. All to ape what felt to me like the declining, losing old-guard technologies.
We push PWAs to iPads & Surface Go devices via Microsoft Intune for some of our clients today.
This path started out very nightmarish (circa 2020) but it's going much smoother today. One of our customers actually came back to us with a slightly improved process based upon the one we gave them. They switched from iPad to Surface Go and used some extra endpoint management to make the PWA experience into a sort of kiosk mode.
The #1 constraint for us is the quality of the environment-facing camera and the level of access we have to its capabilities via the browser. iOS/Safari started out extremely weak on this but is quite good today. I can get a solid 2K environment scan at 30fps from the rear-facing iPad camera in Safari today. Things like 2D barcode scan and document capture are 100% feasible now. These items used to make us extremely nervous on product demos but we don't worry anymore.
We almost capitulated and went back to native iOS apps because of the camera issues, but the pain of maintaining a native build chain when you are otherwise a 100% Microsoft shop (with barely 3 developers) was pushing back hard too. We were signing enterprise IPAs for all of our clients for half a decade before we switched to web/PWA. I will never go back to native apps. I'll find a different career path and hobbies if the web goes away.
I don't have a clean answer for B2C other than... I use HN and Twitter in Safari and I don't even process that it's not a native app. Neither of these web properties had to spend a single second worrying about a native app to acquire my business.
The exact way in which git handles commits is very muddied: it's snapshots on the surface, a bit of diffs when packed, and a lot of operations on commits are actually 3-way merges (including merges, rebases, cherry-picks, and reverts). Keeping track of all of this matters (especially for the operations that use diffs), but it can also get overwhelming for a tool.
In my opinion, it's probably good enough to understand the model git is trying to emulate. Commits are stored more or less like snapshot copies of the working tree directory with commit information attached. The fact that there is de-duplication and packing behind the scenes is more a matter of trying to increase storage efficiency than of any practical difference from the directory-snapshot model. Meanwhile, the more complex git operations (merges, rebases, reverts, etc.) use actual diff algorithms and 3-way merges (way more often than you'd imagine) to propagate changes between these snapshots. This is especially apparent in the case of rebases, where the pure snapshot model breaks down completely (modifying a commit will cause the same change in all subsequent commits).
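As a toy illustration of that (my own sketch in Python, nothing like git's actual internals): commits hold full snapshots, and a cherry-pick is a 3-way merge between the picked commit, its parent, and the target, here at whole-file granularity (real git merges within files too):

```python
from dataclasses import dataclass, field

@dataclass
class Commit:
    snapshot: dict                      # path -> file contents (a full snapshot)
    parents: list = field(default_factory=list)

def cherry_pick(target: Commit, picked: Commit) -> Commit:
    # 3-way merge: base = picked's parent, ours = target, theirs = picked.
    base, ours, theirs = picked.parents[0].snapshot, target.snapshot, picked.snapshot
    merged = dict(ours)
    for path in set(base) | set(theirs):
        if theirs.get(path) == base.get(path):
            continue                            # untouched by the picked commit
        if ours.get(path) not in (base.get(path), theirs.get(path)):
            raise RuntimeError(f"conflict: {path}")  # both sides changed it
        if path in theirs:
            merged[path] = theirs[path]         # carry the picked change over
        else:
            merged.pop(path, None)              # picked commit deleted the file
    return Commit(merged, [target])

# Example: picking a commit that adds b.txt onto a branch that edited a.txt.
base = Commit({"a.txt": "1"})
feature = Commit({"a.txt": "1", "b.txt": "hi"}, [base])
main = Commit({"a.txt": "2"}, [base])
assert cherry_pick(main, feature).snapshot == {"a.txt": "2", "b.txt": "hi"}
```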
This model actually makes sense if you consider the development workflow of the Linux kernel before git. Versions were directories or lived in CVS, and a lot of development was based on quilt, diffutils, and patchutils. Git covers all these use cases, though it may not be immediately apparent.
Added later: It's also interesting to look at Mercurial's model. Like Git, Mercurial uses both snapshots and diffs for storage. But unlike the Git way of layering these two, Mercurial interleaves them, storing diffs with an occasional full snapshot. This is much like the video codec concept of keyframes (I think that's what inspired it). This means that Mercurial, unlike Git, doesn't need repacking. And while Git exposes its internal model in its full glory, Mercurial manages to more or less abstract it away.
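And a toy sketch of that keyframe idea (again my own illustration, not Mercurial's actual revlog format): most revisions are stored as line-level deltas against their parent, with a full snapshot every few revisions so that reconstructing a revision never replays a long chain:

```python
import difflib

SNAPSHOT_EVERY = 4  # hypothetical keyframe interval

def make_delta(old, new):
    # Store only the opcodes plus the replacement lines, not both sides.
    sm = difflib.SequenceMatcher(a=old, b=new)
    return [(tag, i1, i2, new[j1:j2])
            for tag, i1, i2, j1, j2 in sm.get_opcodes()]

def apply_delta(old, delta):
    out = []
    for tag, i1, i2, new_lines in delta:
        out.extend(old[i1:i2] if tag == "equal" else new_lines)
    return out

def store(revisions):
    # Each revision is a list of lines; every Nth one is a full "keyframe".
    log = []
    for i, rev in enumerate(revisions):
        if i % SNAPSHOT_EVERY == 0:
            log.append(("full", rev))
        else:
            log.append(("delta", make_delta(revisions[i - 1], rev)))
    return log

def restore(log, i):
    # Walk back to the nearest keyframe, then replay deltas forward.
    base = i - i % SNAPSHOT_EVERY
    lines = log[base][1]
    for _, delta in log[base + 1:i + 1]:
        lines = apply_delta(lines, delta)
    return lines
```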
"Here's my favorite example, here: 1928, my hero, Walt Disney, created this extraordinary work, the birth of Mickey Mouse in the form of Steamboat Willie. But what you probably don't recognize about Steamboat Willie and his emergence into Mickey Mouse is that in 1928, Walt Disney, to use the language of the Disney Corporation today, "stole" Willie from Buster Keaton's "Steamboat Bill."
It was a parody, a take-off; it was built upon Steamboat Bill. Steamboat Bill was produced in 1928, no [waiting] 14 years--just take it, rip, mix, and burn, as he did [laughter] to produce the Disney empire. This was his character. Walt always parroted feature-length mainstream films to produce the Disney empire, and we see the product of this. This is the Disney Corporation: taking works in the public domain, and not even in the public domain, and turning them into vastly greater, new creativity. They took the works of this guy, these guys, the Brothers Grimm, who you think are probably great authors on their own. They produce these horrible stories, these fairy tales, which anybody should keep their children far from because they're utterly bloody and moralistic stories, and are not the sort of thing that children should see, but they were retold for us by the Disney Corporation. Now the Disney Corporation could do this because that culture lived in a commons, an intellectual commons, a cultural commons, where people could freely take and build. It was a lawyer-free zone."
I lived in Iceland for several years recently. One thing you have to keep in mind is that this philosophy is pretty much born out of necessity. Until the US military built a base in Keflavík (which later became an international airport), the country was quite behind the times. As one friend put it to me, "When you were landing on the moon, we were getting running water installed."
Iceland was a very poor country for most people other than a few well off in the fishing industry. Today it's still relatively "medium" with few people very wealthy, but also almost no one in destitution. The attitude is that everyone should be able to live a good life, and while it's not truly socialist, it's pretty darn close.
Þetta reddast is used somewhat interchangeably with "I don't feel like dealing with this", "Somehow things will work out even though it doesn't look like it now", and "fuck it". To most people it's kind of an in-joke you say when something really sucks.
(Founder of windmill.dev, the closest alternative to Airplane, and we are OSS)
Congrats on the acquisition, Airplane team. You were a strong inspiration for us and a precursor, and you set a high-quality bar for pro-code developer platforms. I have nothing but respect for the Airplane team, and we probably wouldn't exist in our current form without your competition.
We are ready to migrate all Airplane customers, and the migration would be very smooth, as many of Airplane's concepts map 1:1 to our own. If you need an urgent migration, email ruben@windmill.dev.
One customer, nocd, migrated hundreds of scheduled scripts and workflows in just a few weeks and with only minor changes.
Our platform being fully open source, you will never be at risk of us sunsetting anything, since you can fully self-host it (ours is not a hybrid deployment model like Airplane's, where the control plane is in the cloud). We are used by thousands of businesses, including a few F500 at scale, and can send references over email.
We are a smaller team, have raised reasonably, and are close to break-even. Building an open-source product, I make a point of resisting the urge to raise too aggressively, so we can keep control of our destiny and never betray our open-source principles of transparency and fair pricing.
It's certainly been interesting watching the multi-decade arc play out. With Mach as the origin, everything other than task (process) scheduling and virtual memory was out of the kernel and done over Mach port comms. Then XNU, via NeXTSTEP and later OS X, linked much more into the kernel and exposed specific data types using COM-style interfaces in IOKit. And now more and more is moving back out of the kernel.
io_uring networking on Linux is another similar move out to userspace.
> Emulation on a modern x86 CPU will outperform any commercially available RISC-V processor at the moment
That's not true.
qemu-user is a little faster than the single-issue HiFive Unleashed from 2018, but qemu-system is slower.
Against either the dual-issue U74 cores in the JH7110 or the small OoO cores in the TH1520 and SG2042, qemu doesn't stand a chance on a core-for-core basis.
It used to be the case that qemu could win on x86 by throwing more cores at the problem, but with the 64 core SG2042 in the Milk-V Pioneer that possibility has disappeared too -- not to mention that the Pioneer is $1500 for chip+motherboard (need to add RAM and storage), while a 64 core x86 is $5000 just for the chip.
I don't think this is a "cartel of semiconductor manufacturers" so much as it's been a "shambolic cluster of organizations running crappy old fabs into the ground producing cheap chips that were subsidized by a prior decade's worth of very expensive products."
I can afford to sell gazillions of chips at $0.08 per chip if I'm running a fab I didn't pay to build. I'm only (barely) paying for the inputs. When Stan, the last guy who understands how to run the widget verifier, or Elaine, the last lady who understands how to run the polishing machine, retires, I'll have to close up shop.
Those $0.08 per chip devices have been absurdly subsidized in that a replacement infrastructure to make them would require that they cost $10 per device, and the ecosystem of things built on $0.08 chips isn't viable in a $10 per chip world.
In order to have a fab make $0.03 per unit devices, you first have to have the fab spend 10 years making $300 per unit devices, regardless of the underlying node size of those $300 per unit devices.
Likely you couldn't even go back and make a fab that makes large volumes of 60nm-90nm node sizes at all, for any amount of money, because the equipment to do this (new) hasn't been made in 2 decades and no company is willing to invest the money to make new crappy old equipment.
It's not a nefarious oligopoly as much as a synchronized "run the asset to failure" lifecycle of the infrastructure.
How much does it cost to make a 300 year old tree?
Tesla has been kept alive only because they are scared to even let drivers take their hands off the wheel. They have very low confidence in the system to allow for driver inattention, let alone completely remove the driver. Numerous software “rewrites” over the years with many buzzwords attached, and they still haven’t clocked a single driverless mile.
They also have a huge fan base (investors) who are happy to babysit FSD in order to provide more "training data", and who have rationalized to themselves that whatever little benefit it offers is worthy of the product name.
It’s also worth noting that recording can be activated wirelessly by various triggers. The most obvious and common one being that a nearby officer’s camera was activated (either by physically pressing the button or via a chain reaction of wireless activations).
Depending on the available hardware/accessories & configuration, other sources of activation can include unholstering a weapon, aiming or discharging a taser, by computer aided dispatch, unlocking a weapon in the vehicle, activating the light bar, high vehicle speed, running, falling, crashing, and more.
In my opinion, if multiple officers are on the scene with at least one Axon recording device each, there is either video evidence or willful suppression of evidence. It's that simple.
I found that using "helm template" to convert every Helm chart into yaml, and then using Pulumi to track changes and update my clusters (with Python transformation functions to get per-cluster configuration) made my life so much better than using Helm. Watching Pulumi or Terraform watch Helm watch Kubernetes update a deployment felt pointlessly complicated.
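A rough sketch of that setup, assuming manifests were pre-rendered with `helm template` into a manifests/ directory (the stack names and the replica rule here are hypothetical):

```python
import pulumi
from pulumi_kubernetes.yaml import ConfigGroup

stack = pulumi.get_stack()  # e.g. "prod-us-east" vs "dev"

def per_cluster(obj, opts):
    # Python transformation: patch the rendered resources per cluster
    # instead of re-templating through Helm values.
    if obj.get("kind") == "Deployment":
        obj["spec"]["replicas"] = 5 if stack.startswith("prod") else 1

app = ConfigGroup(
    "app",
    files=["manifests/*.yaml"],        # output of `helm template`
    transformations=[per_cluster],
)
```

Pulumi then diffs those plain rendered objects directly against the cluster, which is what removes the extra Helm layer from the update loop.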
Vulkan 1.3 has pointers, thanks to buffer device address[1]. It took a while to get there, and earlier pointer support was flawed. I also don't know of any major applications that use this.
Modern Vulkan is looking pretty good now. Cooperative matrix multiplication has also landed (as a widely supported extension), and I think it's fair to say it's gone past OpenCL.
Whether we get significant adoption of all this is too early to say, but I think it's a plausible foundation for real stuff. It's no longer just a toy.
Yes, you're of course right—and at the same time, if I ask myself how to follow HN's core principle [1] in relation to this topic, I can't see "don't touch it at all" as right either. It may be an impossible quandary—but it's not in the spirit of this place to take an easy way out; or to put it differently, the easy way out (if one exists) is not in the spirit of this place.
What does "curiosity" mean in a context like this? It certainly needs to be more than just a technical dissection of details. I think it has to do with being open to learning. For that we have to be open to each other. And for that, we have to first find some space for the other within ourselves. Comments that have to do with annihilating the other (including in virtual form, such as by defeating them in internet battle) are therefore off-topic in a thread like this, as I posted above.
(Edit: there's also a kind of curiosity in walking into the impossible to find out what's doable; and also in taking a different approach with each attempt—which is why my pinned comment in this thread is different from last time.)
In my experience, it's fear over ego. At the last company I worked for, the CEO discovered Marty Cagan and immediately switched the entire company over to following his recommendations. I was on a product team, and we came up with our own strategy to meet our OKRs. The CEO didn't like our approach and decided that, actually, he was going to set both the objectives and the key results, and tell us how to achieve them. We were basically a feature team.
But it wasn't ego, it was fear. The company was running out of runway; we just hadn't been told yet. The CEO adopted the product-team strategy as a Hail Mary, out of sheer panic, and he sabotaged it for the exact same reason: he was terrified of not trying it, and then terrified of it not working.
Not only because of this anecdote, but because of several other important ones, I view leadership as generally fear-driven in its decision making. Ducklike, they appear calm and even confident on the surface while desperately thrashing and flailing below it. Or, like someone falling down a hill, they reach out for anything to hold on to.
The Arcan display server is a really cool idea. Even if it doesn't manage to get popular, I think there are ideas here that we could mine for use in popular programs.
This is what I constantly tell my students: for the most part, the hard part about building a tech product isn't what beginners think makes tech hard — the hard part is wrangling systemic complexity in a good, sustainable, and reliable way.
Many non-tech people, for example, look at programmers and think the hard part is knowing what this garble of weird text means. But that's the easy part. And if you are a person who would think it's hard, you probably don't know about all the demons out there that will come to haunt you if you don't build a foundation that helps you actively keep them away.
As someone who has been working on porting a desktop environment to Wayland for the past year or so, I don't hate Wayland, but I certainly don't love it. After 10-15 years, it still seems pretty half-baked. Some pretty fundamental-feeling protocols have been languishing as merge requests on git.freedesktop.org, some for 3+ years at this point. I get that you don't want to rush standardization of things that need to remain stable and backward-compatible for (hopefully) decades, but c'mon, this is getting to be a bit much.
At this point I cannot actually replicate Xfce feature-for-feature in Wayland, not without inventing private protocols of my own, which I'd prefer not to do. My biggest issue right now is xfce4-panel, and embedding surfaces from one process into a parent process. There's no Wayland way to do this, and the Wayland folks have explicitly rejected some sort of XEMBED-workalike Wayland protocol, instead telling people they should write an embedded Wayland compositor. Well, I've been working on that for the past eight months, and it's hard to get right. I have something that "works", but it has a lot of rough edges, and, critically, eventually either the parent app or embedded apps run into weird spurious protocol errors and quit. While yes, that's technically my fault and my bugs, working with an under-documented protocol and really bare-bones libraries for building apps (wlroots, sadly, is not suitable for writing an embedded compositor, as it does too much) doesn't make any of this easier. I've written more than 10k lines of code (so far), and, meanwhile, you can do an XEMBED implementation in a few hundred.
I get that X11 made certain things impossible. But why not build an "X12" rather than throwing everything out and starting completely from scratch? It seems like people just didn't feel like it, and building something new was more exciting and would look better on their resume. I of all people know it's not cool to tell people what open source software to build in their spare time (though, to be fair, many/most of the people involved in Wayland are employed or at least funded by corporations), but jeez, this has been exhausting.
It's really hard to overstate the impact of PKD's later works and JG Ballard. Having read about 50% of this list, I think all of our modern dystopia, paranoia, and existential SciFi sprouted from their insane seeds. And the depth of LeGuin's exploration of social, sexual, and economic identity in alien worlds sticks out like a sore thumb compared to her peers.
Damn, tho, the book cover for The Three Stigmata... is fascinating. I always pictured him as just a cyborg octogenarian but now I can't unsee this. I miss that artwork.
Aaaaand, it would be nice if the page author would put these in a CSV at the bottom so that I could make a checklist!
Create a folder on your computer or get a sturdy box made of good cardboard with a lid. Name the folder “Process”. Write the word “Process” on the box.
While working, occasionally take photos or screenshots of what you are doing showing your workspace, the computer desktop, the desk with pencils and papers and cables everywhere, the wall or piece of string with notes. Show the messy process of creating something.
Type notes on text files and save them with a name like yyyy-mm-dd-note-title.txt. Write notes on bits of paper and notebooks and journals with pencils and pens that you keep all around the places you spend most of your time in, including within arms-reach of where you sleep.
Practice writing down notes on a piece of paper in the dark, so you can do so when waking up in the night, before daybreak, to jot down thoughts and ideas from dreams.
Record messages and melodies using your pocket computer and remember to save these in your Process folder, too. You are looking for your voice.
Put these digital and physical notes in the Process folder and in the Process box.
Thank yourself later, in years to come.
You are what you observed. Experiences, memories, stories to be told. Put your marker on the map in time, that others may find and learn from.
KH-11 was cheaper than KH-9 by far ($3B vs $16B inflation-adjusted, both for 20 satellites). Each KH-9 had 4 re-entry vehicles, so you could only really do 4 passes per satellite before needing a new one (or some way to replace the module that housed the RVs).
It's likely that 'we might need to bring them back...' was justification for USAF remaining as a voice in the shuttle program after KH-11 rendered the satellite fleet reusable, though.
AFAIK, CXL 3.0 will run on top of UCIe as well. Let's hope for an open standard that will enable SoCs to be composed from many compatible chiplets. Cache coherency and fine granularity below page size are key ingredients.
The tech industry's mistake was all collectively setting up shop in a small handful of trendy, dense, expensive cities and not investing in housing, transit, or otherwise doing anything to improve the collective quality of life of those cities.
The repercussion of that mistake is remote work.
I don't think he's wrong on principle - remote work is less dynamic and impactful than physical teams. But it's the "...and it has to be in San Francisco" part that reminds me of how out-of-touch these guys are. Are in-person meetings better? Yes. Are they worth commuting 3 hours every day through a city that hates you? No.
And it never got that ecosystem because of inscrutable mismanagement so scatterbrained and perplexing that I can't wait for the insider-tells-all book about this post-iPhone period at Microsoft that I assume will arrive in a decade or two.
Zune, Phone, Surface, Windows 8, Windows RT, Bing, Cortana, Continuum, Project Astoria, I'm sure I'm forgetting. Apple and Google execs must have been amused.
It's not about monetization - that can be done just fine with this third-party client as it calls the same APIs as the official mobile app (thus if the mobile app requires a paid account for a certain action, so will this).
This is about "engagement". There are a lot of oxygen wasters out there whose careers and paychecks depend on "engagement" metrics aka how much time has been collectively wasted wading through the cesspool that their software is. The annoyance and wasted time is the point, and an alternative client (or other way of automating it) goes against that.
People often talk about "bullshit jobs" around here, but what everyone overlooks (or refuses to acknowledge, as it's uncomfortable) are all the bullshit jobs in the tech/software industry whose holders build their careers on end-user annoyance and misery.