
A big pile of all kinds of things.

Big Project Complexities - this has nothing to do with the software business. It's an organizational challenge that was well known long before anyone wrote anything remotely complex in software. See the planning fallacy and optimism bias, and read about how car and structural engineering companies have handled it. Look up "GM, Toyota and NUMMI" as an example.

Cross-Team Dependencies. Again, this is an organizational challenge: cognitive load, poor communication structures based on industrial org designs, military-style command-and-control leadership, hierarchies, etc. Read Fred Brooks and Mel Conway on these for a more software-specific treatment.

Meetings, meetings, meetings. Of course there is a way around this, and it's called a manager. No manager should allow any team member to attend a meeting without a thorough agenda, actions for attendees, proposed follow-up actions, and someone who controls the meeting. Most meetings will then turn into written communication instead of verbal.



Programmers: Skip the damned meetings!

Also programmers: Who decided, and why, that we should be doing this thing this way?!

Or: Why is the other team doing things this way?!


The problem is not having meetings per se, but having useless and inconsequential meetings.

It's much like requiring that any computation should either return a value or produce an effect. Without that, it's just waste.
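To make the analogy concrete, here is a minimal sketch (in Python, names made up for illustration):

    # A computation that neither returns a value nor produces an effect
    # is dead weight -- the same test applies to a meeting.
    def pointless_meeting():
        total = sum(range(10))   # work happens...
        # ...but nothing is returned and no state is changed

    def useful_meeting():
        return sum(range(10))    # the result actually goes somewhere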


> but having useless and inconsequential meetings.

You only know a meeting was useless after having had it, when the decisions or discussion turn out to be moot, already known, or completely expected.

So how do you know whether a meeting is required, before organizing and/or attending one?


A few tricks that can catch meetings that are more or less guaranteed to be useless (in IT):

* Does it have fewer than 10 people attending?

* Is there an agenda?

* Are there more technical people than managers?

* Do you know why you've been asked to attend?

If you can answer "No" to any of those questions, you're going to have a bad time. You can still have meetings where you answer yes to all of the above and they turn out useless, but the chances are lower.


I think only the first and last are really important. I've had very effective meetings where I was the only technical person and we didn't have a formal agenda. Often meetings with stakeholders where I wanted to know what I could do for them; what they needed. So I guess that's still an agenda, but more important is that I was in control and the meeting was small.

Being a bystander in a big meeting, that is the worst. If it doesn't affect me and my input is not needed, why am I there?


Don't you mean more than 10?

A meeting with more than 10 people tends to waste more people's time than a meeting with fewer than 10.


One rubric I use is "is there one or more questions to be decided on in this meeting?" If no, it's a pointless meeting.


The problem is that some people, engineers included, think that way more people should be involved in decisions than necessary.

One structural cause of that is the “nobody owns anything, everything is shared and employees are replaceable cogs” paradigm. Because then it’s suddenly a real problem that people do things a bit differently. You have to not only agree on what to do (the interface) but also the how (implementation). It would be the same problem in a band if you decided to rotate the instruments every time you jam.


I enjoy working at companies where things get written down so you don't have to be at the meeting, unless you specifically want to participate in the real-time discussion.


managers: why is this taking so long? I'm scheduling a daily two hour meeting and inviting everybody so I can yell at you for taking so long!


And said managers refuse to attend the daily standup.


>> No manager should allow any team member to attend meetings...

I say this as a manager: who do you think is scheduling many (or even most) of these meetings?


Indeed. That’s why the role of the scrum master exists – the thinking goes that because managers can’t be trusted to protect the team from unproductivity due to complicated conflicting interests, you need a "neutral party" to do it instead.


Great! Let's schedule a meeting to discuss your thoughts!


There is also a different way around this: skip all the meetings, ship code into production, and iterate toward good solutions to complex problems based on feedback.


> skip all the meetings, ship code into production

This works until you have an outage, disaster, or embarrassment of some kind, at which point everyone is banned from doing that without approval.

This leads to a kind of "trauma driven development"; when you suggest a change, you see a flash of fear in the eyes of people around you. If you punish people for improvements often enough, you don't get any improvements.


Imagine if a surgeon worked like this: no prior diagnostics, no procedures to follow, just cut people open and take it from there!

Honestly, you shouldn't deploy to production without testing. Development and deployment need to be tracked and scheduled, because in the worst case a deployment has to be reversible. If you start deploying directly, without coordinating with others (skip all the meetings, right?), you are begging for disaster.


There's always a good case for slowing down and adding more rules. You can even build a whole career out of it, and if you do it right you'll be harder to remove than a tick - simply ensure anything that might threaten you requires your approval and review.

Convincing everyone that a mistake deploying software is equivalent to a botched surgery, though, that's a good one! I've not run into that one before but I imagine you could get a lot of mileage out of that at the right company.


I work in aerospace; the last time a company botched software, hundreds of people died. Worse than a botched surgery.

Others rely on whatever software is produced: co-workers, customers, users. I think those people deserve a good-quality product, and that requires at least some rules and procedures. Unless, of course, it is a single-dev thing. In that case you won't have middle management and performance reviews either.


Obviously the degree of caution necessary depends on the nature of the problems you're solving. But your typical consumer app doesn't need the same degree of caution as aerospace, and using a broad brush to say "everyone must be as risk-averse as the most risk-averse company" isn't a great approach.


I'd be willing to bet that the most recent mistake during a deployment that an aerospace company made didn't result in many people dying. Maybe the last one you read about in the news, but that biases the selection a bit.

That said, if you're sending people to the moon, and are actually working on the system that sends them there rather than the website, feel free to have a more rigorous testing procedure. But if you aren't, consider maybe you don't need it.


And guess why the fuck-ups killing people are so rare?

Not every industry has to follow aerospace or life-science standards, but every company owes its customers a good product. And with complexity of product, organisation, and requirements, that requires coordination, communication, and rules.

Ignoring that, cowboy style, not going to meetings, deploying to production directly, not understanding requirements and blaming all of that on management is simply unprofessional.


I'm still not convinced yet that more meetings makes better software, but I appreciate that you have shared your position so passionately and I feel like I learned a lot.


You're lumping a lot of things together here. It's entirely possible to test changes and have reversible deployments without them being scheduled or really coordinated with anyone, and "tracking" deployments can be trivially automated as a log of git refs deployed without any real bureaucracy or human involvement.
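For instance, a minimal sketch of that kind of automated log (the file name, the function name, and the idea of running it as a post-deploy step are my own assumptions, not a description of any particular setup):

    # Append the currently deployed git ref and a UTC timestamp to a log
    # file -- an audit trail with no tickets and no human in the loop.
    import datetime
    import pathlib
    import subprocess

    LOG = pathlib.Path("deployments.log")  # hypothetical location

    def record_deployment():
        ref = subprocess.check_output(
            ["git", "rev-parse", "HEAD"], text=True
        ).strip()
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        with LOG.open("a") as f:
            f.write(f"{stamp} {ref}\n")

    if __name__ == "__main__":
        record_deployment()

Hook that into whatever already runs the deployment and the "tracking" is done.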


> ship code into production, and iterate toward good solutions to complex problems based on feedback

It makes me sad that this is even considered a serious possibility. It implies that as an industry we have so successfully convinced our customers to accept and pay for junk - as if software being junk is somehow normal or inevitable - that making an effort to competently build a good product doesn't generate enough competitive advantage to be a reliable winning strategy.


Partially, I think, this is due to fast and ubiquitous internet. I remember the days when games were sold on physical media and played single-player on a mostly offline computer. Same for other software: updates and patches were cumbersome and expensive. Hence, software had to be complete when shipped.

Fast forward to today: everything can be patched and fixed OTA. That possibility seems to have made it acceptable to ship pre-beta stuff. That all those OTA updates are an additional revenue stream didn't help.


I am absolutely certain you're right. The forcing function to achieve a respectable level of quality before you ship isn't there in an always-online world.

The other big factor IMHO is the amount of speculative money that has poured into the industry in recent times. That has allowed a lot of businesses to survive for longer than they naturally could despite shipping junk and that in turn has taught users/customers that expecting better software is unrealistic. Even most unicorn tech businesses (FAANG etc.) still produce a lot of junk and little that is any good - relative to the almost unimaginable scale of resources they have available - because usually they reached that unicorn status by having one or at most a few massive successes that they can defend and that then subsidise all the bad decisions and junk products just as effectively as VC funding rounds.


Couldn't agree more. We can lament all day long, though; it won't change, will it?

When it comes to stuff like that, I feel old!


I find it's good to work outside the bubble of modern web/SAAS development from time to time. It doesn't always pay as well but it does maintain a sense of perspective and an awareness of what is possible. Changing cultures is hard but working somewhere with a healthier culture can be easier.


I would argue that the reason companies ship junk is they can’t really afford the costs of slowing down and doing it right.

Even mentioning the true cost would shut down most deals.


That is certainly a popular claim but I question how true it really is. I find modern development processes with this emphasis are often comically inefficient. In particular a lot of people who are used to working under those processes seem to dramatically underestimate how much drag the rushed decisions and accumulating tech debt will add to their entire project and how quickly that drag effect will be felt.


I don’t think it is related to design decisions necessarily — I think it is more related to coding to the most common case when there are many more edge cases exposed when you scale the number of users.

I mostly witness hurried scenarios in software contracting where each contractor tries to do the bare minimum to meet the definition of demoable. They then leave the project and mark it as another success.


Limiting scope at first is reasonable and often necessary. It's the basis of the MVP idea that has stood the test of time.

I wouldn't describe a well-made product with limited scope as junk though. By junk I mean compromising quality and taking on excessive tech debt, not just starting small because you have to start somewhere.


Alternatively, skip all the meetings anyway and then fire your middle manager. Or, if that fails, go work for a company that values your time.


If you skip all the meetings, your middle manager will fire you and you will have to look for another company.


Oh no, what will become of my beautiful performance management reviews


You found a company of more than 5 people without those? Good for you!


Any company of under 200 with a middle management layer that enforces meeting attendance through the threat of firing and runs a formal performance review process should be considered de facto sociopathic imho.


Counterpoint: Any employee who refuses to go to meetings and coordinate with others is toxic.

Meeting attendance is important because:

- others need input or have relevant information to share

- activities have to be coordinated

- decisions have to be made and shared

Getting the balance right is tricky, but just not showing up is simply not acceptable.


All of those things are better done in writing so that:

- There’s a record of decisions and the thought process behind them

- When asked for input I can take time to consider a response and therefore give a higher quality reply

- I can refer back to conversations or search them when I inevitably forget some detail

- I don’t miss key information because I was out sick or had to take a bathroom break

- There’s a paper trail you can copy/paste when there’s a disagreement in understanding

- In meetings it can be frustrating to try to get a word in: there are often too many people or just one really chatty bastard who won’t stop

- It’s easier to scroll past chatter about weekend plans and sports than it is to sit through it in a meeting

The problem with meeting notes is they only reflect one person’s understanding, and often the person taking notes only has half an understanding anyway. Meeting notes are like JPEG compression set to the lowest possible quality.

I work with a team that spans +9h and +16h from me and it’s great because there are few meetings and lots of written communication. It’s very easy to search past communications to find old materials or decisions, and that’s not just hypothetical: I do it ALL the time.

We’ll use meetings, but for more tactical purposes: troubleshooting a specific problem, or doing a demo of some feature (although even then I would suggest demos are better as a recorded video).

Most objections are really something like “wah, I don’t want to read or write, it’s too haaaard.” Reading and writing were some of the earliest inventions in human civilization and everyone learns them starting in pre-school. It’s a low bar for a grown professional.


Now this is what I would call a professional approach to work.


Agree to disagree maybe. But consider trying an experiment: take notes so anyone who wasn't there, for whatever reason, intentional or not, can read them later, and don't threaten to fire or call toxic anyone who didn't make it. If you end up in a meeting by yourself, just make a decision and write it down. I'd personally be very surprised if all of a sudden you were unable to coordinate activities, make decisions, or share information.


May I remind you that you started with

>> Alternatively, skip all the meetings anyway and then fire your middle manager. Or, if that fails, go work for a company that values your time.


Imho you'll find that a middle manager who spends his days in meetings by himself, wishing he could fire the "toxic" people who didn't come, is non-essential, but I'm happy to leave that journey of discovery as an exercise for the reader.


You must be a real pleasure to work with...


Thanks! Your kind words just made my whole day. I appreciate that even when people disagree they can be civil.


We are totally not on the same page, but in all seriousness, I love your sarcasm. So maybe we actually would get along rather well at work!


Not going to meetings and not coordinating with colleagues are two different things.



