Reading this list made me really want to explain the basics of each of those items so as to clear up any misconceptions, or to confirm that, yes, he gets the gist of it.
When I try to learn about topics that I'm unfamiliar with, the hardest information to find is "Why is X important?" and "What is the most important thing to know about X?"
With those in hand, I'd be able to determine if a deep dive into the topic is useful for me or not.
>With those in hand, I'd be able to determine if a deep dive into the topic is useful for me or not.
I think that's the point: we all know (most of) the gist of it. But the practical experience isn't something you can prep for in advance. You never know what tech you'll need to use in the real world. The "deep dive" comes when you get hired and the employer needs that tech. I'm not sure the employer gets that.
I'd love to see lists like this from more high-profile programmers. What does John Carmack not know? What does Dan Bernstein not know? What does Bryan Cantrill not know?
Not exactly that, but when reading Coders at Work I was surprised to learn that a lot of big names in our field use "printf debugging" and don't like actual debugging tools.
As a developer of distributed software, I have found debuggers less useful these days. If I need a debugger, it probably means I don't really know the code that I am working on.
Good IDEs with good static analysis, good unit test/integration test/e2e test coverage, a true understanding of the code base before changing it, all these avoid most bugs.
When production hits a bug, good monitoring and logging can help with root-causing. It is extremely difficult to debug and reproduce a bug in a distributed environment. Many bugs I have root-caused were found purely through monitoring and logging, with the proof coming from mentally executing the code.
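For illustration, here is a minimal sketch of the kind of structured logging that makes this workable (my own example, not any particular company's setup; the field names and `handlePayment` are invented): every entry carries a correlation id so one request can be traced across services from the logs alone.

```ts
// Hypothetical sketch: structured log lines with a correlation id.
// Field names and the handlePayment example are made up for illustration.
function log(level: "info" | "error", msg: string, fields: Record<string, unknown>): void {
  // One JSON object per line, so a log aggregator can index and query the fields.
  console.log(JSON.stringify({ ts: new Date().toISOString(), level, msg, ...fields }));
}

async function handlePayment(requestId: string, amount: number): Promise<void> {
  log("info", "payment.start", { requestId, amount });
  try {
    // Downstream services would log the same requestId, letting you stitch
    // the whole request path together after the fact.
    log("info", "payment.ok", { requestId });
  } catch (err) {
    log("error", "payment.failed", { requestId, error: String(err) });
    throw err;
  }
}
```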
If you're working in a large legacy system where it's not feasible to fully know the code you're working on, I find that debuggers both help me find bugs quicker and improve my understanding of the code.
Not a big name myself, but I'm always surprised that this surprises people. I mostly use print statements, simply because a) every environment has them, b) there is much less (none) fiddling involved to set them up, and c) because debugging is rarely needed anyway, assuming you are working on your own / your team's code (and that you are half good). I think the last time I used a debugger was on what I'd call a textbook case of spaghetti code - not proud of it, but I did learn not to do that. :)
This is interesting, I use the debugger a lot, in whatever language. I've debugged with CLI interfaces and I've used IDEs.
I don't want to claim this is superior to printf debugging, or whatever other technique. I just like seeing the code executing, and debugging some weird error can sometimes be an almost relaxing activity, as you just step through the code line by line and try to keep track of what's happening without needing to think more than 2 or 3 steps ahead.
That said, I'm also the guy who will happily put breakpoints into internal framework code (e.g. Spring Boot) so that I can understand why the framework doesn't do what it's supposed to do. Usually, I find the answer, although maybe in some cases it would have been more economical to use a workaround instead, but that would leave me dissatisfied...
I rarely use the debugger in highly async systems (e.g. node) because it's just a mess, but in C/C++/Go/Java where there actually are good debuggers and you tend to have chunks (e.g. entire functions) of synchronous code, I find it very, very helpful to step line-by-line through most nontrivial code at least once for any major code path. A long time ago, I had a sort-of mentor who, whenever he looked at new code, asked the author "have you single-stepped through it yet?" so I did it to humor him, but the number of times I caught errors was eye-opening, so to this day I still like to do it.
It's not only a matter of usage. Many good programmers will argue that "actual debugging tools" are mostly useless and that logging the execution of your program is a more productive use of your time. In some sense, time spent inside the debugger is lost forever. On the other hand, the time spent writing good logging code is really worthwhile, and will be especially helpful for detecting future bugs that you have not yet written.
I'm not a good programmer, but I can say very proudly that I've never used any debugger (except for assembly language, where the debugger is more like an interactive shell).
I’ve found that the more state and code paths there are leading to the bug, or the more confused I am about it, the more likely I am to need a bunch of print statements. The debugger for me is a fine scalpel that works great when I think I precisely understand what is going on but my code appears to be doing something impossible. Starting to step from the top of a program and just looking for what goes wrong is usually a huge time sink.
Debuggers are in almost all cases a superset of print statements. However, you have to learn how to use them effectively since there’s so much you can do. I assure you that single stepping to the problem is not how to use a debugger to find problems ;)
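One concrete example of what that can look like (a hypothetical sketch, not this commenter's specific workflow): rather than single-stepping from the top, pause only when the suspicious condition actually holds, either via an IDE conditional breakpoint or a guarded `debugger` statement. The `processOrder` function below is invented for illustration.

```ts
// Hypothetical example: break only on the interesting case.
function processOrder(order: { id: string; total: number }): void {
  if (order.total < 0) {
    // With an inspector attached (e.g. `node --inspect-brk`), execution pauses
    // here only for the suspicious order; without one, this is effectively a no-op.
    debugger;
  }
  // ...normal processing continues
}
```

Most debuggers also let you attach the same condition to a breakpoint in the UI, so you don't have to touch the code at all.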
No one ever really used the debugger at any of the C++ shops I've worked at; it was almost exclusively logging. No one even knew how to get the build system to produce debug symbols at my last job.
that's a feature not a bug. you see... in today's world, if you need to attach the debugger to understand something you have already lost. why? because you should not be able to attach a debugger to a production system (if you can, that is another can of worms). ideally the logging + metrics of what you are building are enough for you to understand what is going on.
there is one notable exception to the no debugger rule: if you are learning/exploring a new code base on your dev machine and/or when writing tests [to cover legacy code]. but apart from that, unless your work is really special, you should not need a debugger.
IME it depends on what kind of bug you are dealing with, and also what language and debugging tools you have available. Historically I used to do a lot of printf debugging. The other day I was helping a friend write some C++ code and we had a bug where only the first element in an array was matching a condition. Using the debugger in my IDE I set some breakpoints in a couple of places that seemed close to where the problem was. The advantage of the debugger is that it is fast and easy to see at each breakpoint what all of the defined variables and their values are. If you do it manually you could easily forget to print one of the variables that you ought to be paying attention to. In this case the debugger helped me figure out the problem within only a couple of minutes, which was great because we were really pressed for time at that moment.
Furthermore, in response to what one of the siblings said about creating logging infrastructure, in my own debugging I tend not to find much value in keeping a bunch of logging around. For me, keeping logging around bogs me down. Typically what I will do if I do printf debugging is add logging statements, hunt down the bug, and then commit the fix together with the added printfs because they add context; then I immediately make another commit where I remove the printfs. This way I have them in the version history should I need them, while at the same time avoiding extraneous log entries at a later time.
And really I’ve found that over time I do printf debugging less and less.
I attribute this mainly to four things:
1. A friend of mine that encouraged me to use an actual debugger
2. Watching Jonathan Blow program on his Twitch stream and observing how proficiently and quickly he is able to get the information he needs by using the debugger
3. CLion. As much as I like the commandline, I never found gdb with the commandline interface to be a suitable tool for me. Whereas with CLion I get a good debugger that makes visible to me most of the information that I need while I am debugging
4. Through years of programming I have learned to recognize many of the types of bugs that I produce. Also, with experience I have learned to write code that is easier to reason about and to debug as opposed to in the beginning where I was trying to be “too clever” for my own good with the code that I wrote.
I think another of the sibling comments put it well about the debugger being like a scalpel. Trying to step through a whole program and following what’s going on would be slow, confusing and ultimately error prone. In order to effectively use a debugger I think you need to have a hypothesis about what is causing the bug and then set very specific breakpoints that you make use of in order to figure out where things are going wrong. From your initial observations you can set more breakpoints at a more coarse level (shallower call stack levels) and use those to observe what leads up to things going wrong.
Why would vocalizing that be dumb? It's actually not a matter of perceived capabilities, but of concrete results, and, in the case of those guys, there isn't much room for discussion in this regard.
I love this list. My favorite thing about it is not only do I not know most of those things I don’t _want_ to know them.
My background is video games. I specialize in real-time 3D applications. I think web tech is an overcomplicated embarrassment of cruft built on cruft built on cruft.
Programming is a wide and diverse field. Webdev is the overwhelming majority of programming jobs. But you can have a very successful career not knowing _any_ of that stuff.
+1, one can't know everything. Known unknowns are often easy to fix: either learn them or find someone who knows them. My company regularly reviews postmortems of production outages, and almost all of the ones I have reviewed were caused by a code or config change where the author/operator had no idea what could go wrong.
I love that article. Again, the harsh reality is that in the average developer interview in London they will ask you all those questions and much more. And if you happen to get hired in fintech, you will most probably spend the majority of your time in meetings and working with Excel spreadsheets... never touching any code.
This is common. I think there are few people that are masters at everything. Instead you should focus on building a talent stack or a skills mosaic. (Choose whichever term you want). The point is you might not be a master in any one subject but you might be pretty good at a handful of things. That permutation is enough to make you valuable and unique.
I wonder if this is what contributes to the instability of most software though, especially SaaS.
People do just enough to get by and "get things done," which always looks great at the time it happens. And then things break, and no one knows why, because no one has the depth of knowledge to investigate it. But it has to be fixed, so typically one or two people on the team are assigned to it and are miserable for days or weeks while they painstakingly try to learn about the things they should've known in the first place, while at the same time trying to fix a system under a lot of pressure from management.
I wish the software industry rewarded knowledge, correctness and excellence rather than speed of execution.
There are way too many things involved in a software system these days for people to know all of it. We shouldn't expect people to be highly knowledgeable about all of a programming language, multiple frameworks, libraries, cloud providers, CI/CD, security, and who knows what else. Instead, those should be specializations and teams should be composed of multiple specialists.
I'm well aware that what I'm saying is a pipe dream. The industry at large is fine with the idea of deploying broken things and patching them later, no matter the cost to the health of engineers, so it's not gonna change anytime soon.
You're right. Just by looking at job posts you'll see requirements to know several languages, frontend frameworks, databases, enterprise services, and so on. The modern programmer is often a jack of all trades, but a master of none.
I'm building multi-million dollar applications for companies, and often it's with languages and frameworks I've never used before.
Totally agree up to "There are way too many things..." That seems like blowing a nuance out of proportion.
I think what we need is to stop having "flat" feel-good organizations where everyone gets a say, and similarly we need to stop having corrupt organizations where the most politically savvy person is at the top.
We need strong, competent technical leaders who truly know most languages, frameworks, paradigms, and we need to pay them well. They need to take responsibility for the whole project and architecture, ruthlessly simplify, and work with product to define what actual problem is so the team and code doesn't turn into Frankenstein's monster.
From a selfish perspective, I like the current system. It makes it easier to stand out. Understand where depth is lacking on your team and become an expert. Whenever problems occur you become the go-to. Pros and cons but mostly pros.
From the list, I would suggest to learn C, because a lot of stuff is written in it or at least uses its ABI. So understanding that will give you a lot of leverage.
Besides that, a deep dive into operating systems (yes, that means to get rid of MacOS if you use it) will give you a solid foundation to understand networking, containers and even more.
I do recommend learning VHDL/Verilog (it's like learning functional programming or prolog - a completely unique experience), but it really depends on what you're after in life. Churning out Java/Web CRUD apps is enough to earn a living, it is actually what most programmers do (and even though every such app is boring as hell, collectively it is highly important for the world).
It really does depend what you're after. I find working on CRUD apps quite interesting, because I care greatly about UX and understanding the people and business problems that motivate the tech.
Reading it I feel that creative problem solving is more important than simply knowing stuff. I wonder if knowing too much stuff is actually detrimental to exploring new ideas and trying them out.
The first time I spun up my own Electron app, I simultaneously had two feelings:
1. That took a lot of futzing around.
2. But way less futzing around than I expected for a fully cross-platform application.
I still think the idea of Electron (using modern web technologies to develop applications) has merit. It just needs a better implementation (i.e use native webviews rather than shipping an entire browser.)
YMMV and I know I’ll be judged a gatekeeper, but not knowing: networking, modern CSS, some familiarity with SASS, CORS, basic deployment / platforms, and graphics APIs (!) all seem like pretty gaping holes for a frontend engineer - you’ll most definitely need those when building actual products for the web.
I don't think you're a gatekeeper, I just don't see the relevance to the author.
He's a library designer and is working with a team maintaining and updating one of the most popular Javascript libraries in the world. He has a very deep background in that language which is one of the most important skills for his role.
I don't blame him for not knowing some of those things you listed more deeply. There's only so many hours a day to dig deep into a wide range of material. You still have to spend time going into other material that might be more relevant to the day to day including testing, managing people, communication with team members, planning, etc. Yeah sure I'd love to be more knowledgeable about CORS or networking, but I also want to be much stronger in Javascript, the JS library I'm working on, design patterns, deployment, etc.
Lastly, I'd rather be in Abramov's shoes where my work profoundly influenced web development than be all over the place with my 'expertise' and not actually be all that impactful with my time.
Seconded. Also, reading the list, it's likely he just delved deep enough to solve the issue at hand and then got back to work.
I can relate to that a bit, especially CORS since I never bothered to see what was up with that at a deep level either. Honestly just used a module for Flask when needed to get it working there and set up a CORS proxy on a cloudflare worker for a different project and called it a day.
Maybe that's how it went with him too. I'd call this something like "Laze Driven Development"—learning just enough to solve the problem (like my experiences with bash scripts).
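For anyone curious, a worker like that can be surprisingly small. Here is a rough sketch (my own guess at the shape, not the parent commenter's actual code; `UPSTREAM` is a made-up placeholder) using the classic service-worker Workers syntax: forward the request and attach the CORS headers the browser insists on.

```ts
// Hypothetical CORS proxy as a Cloudflare Worker (service-worker syntax).
const UPSTREAM = "https://api.example.com"; // placeholder, not a real endpoint

addEventListener("fetch", (event: FetchEvent) => {
  event.respondWith(handle(event.request));
});

async function handle(request: Request): Promise<Response> {
  // Answer preflight requests directly.
  if (request.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
        "Access-Control-Allow-Headers": "Content-Type, Authorization",
      },
    });
  }

  // Forward everything else upstream and copy the response,
  // adding the header the browser is actually checking for.
  const url = new URL(request.url);
  const upstream = await fetch(new Request(UPSTREAM + url.pathname + url.search, request));
  const response = new Response(upstream.body, upstream);
  response.headers.set("Access-Control-Allow-Origin", "*");
  return response;
}
```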
He did create a big chunk of React. Half of the Github issues I see for React have him involved. He might not be a founder of React, but he sure is a major React author.
But you can also apply JIT to learning as well. Depending on your work, you may come across a situation that needs these things anywhere on the continuum from often to never. As the OP is a Facebook employee working as a core React team member, not knowing all these things clearly hasn't been a huge blocker.
But similarly, he might well know these things to an above-average level, and just think that he doesn't.
"As the OP is a Facebook employee working as a core React team member, not knowing all these things clearly hasn't been a huge blocker."
In a practical sense that should be fine, given the level of specialization that a company like Facebook can support at its size and scope. But I wonder if Dan would make it through the average Facebook engineer interview without his Redux cred.
He is also the rare person who is very honest about this kind of thing which is refreshing. For example he claims he only vaguely understands time complexity ("nested loops are bad").
I don't know exactly what point I'm making here, besides there can be a disconnect between a developer interview process and what people actually contribute. I think engineers should strive to be solid at the fundamentals, but at the same time there are very productive people who just learn as they go. Also, it takes a village so to speak, and over-optimizing for Comp Sci majors who drill LeetCode may not be the best long term choice for building teams either.
Now you've got me wondering how people who became well known for creating open source projects go through the hiring process at these big tech cos. I know the creator of Homebrew still had to go through a typical Google interview.
I am talking about people like Jon Gjengset who got hired to work on Rust at AWS.
He was hired at the same conference at which he showcased Redux for the first time (as a byproduct of the content of his actual talk, which was time-travel debugging), so it's unlikely it directly gave him the cred necessary to get hired, since he was already in the hiring pipeline.
If your speciality is frontend JS engineering, why are deployment, platforms, and networking (unless you simply mean how to use fetch/XMLHttpRequest) large holes? I'd even question graphics APIs beyond very basic knowledge, because in my experience I've never needed to know anything about those beyond the basics. These things sound more like full stack, rather than frontend.
Yes, networking as in fetch/XHR, web sockets, HTTP, REST, protocols, performance and so on. How your frontend code runs. All I mentioned are strictly frontend skills, not anywhere near full-stack.
I know exactly who posted the article, thank you. I simply disagree that you can be a frontend engineer in the usual meaning it has without some level of those skills. Doesn’t mean anyone is less capable, deserving or cannot make a living that way.
But he is literally one of the most successful front end engineers of all time and lacks those skills. Your takeaway from the article is its antithesis.
I simply don't believe he lacks those skills. He says "Python. I feel bad about this one — I have worked with Python for several years at some point and I’ve never bothered to actually learn it."
He's skilled at JavaScript and C# and worked with Python for years, what's the betting he's as good as any typical Python developer, if not better? Pretty good, I'd think.
> "I struggle to read either LISP-inspired (like Clojure), Haskell-inspired (like Elm), or ML-inspired (like OCaml) code"
How many people don't know Haskell but can successfully struggle through reading it instead of having their eyes glaze over and completely refuse to try?
> "CORS. I dread these errors! I know I need to set up some headers to fix them but I’ve wasted hours here in the past."
"I don't know it - but I've also worked on it and know how to fix it and done so" - hmmmm. I suspect he is more capable than he's making out; politely talking himself down.
> I suspect he is more capable than he's making out; politely talking himself down.
Why?
If you are heads-down working on core Javascript technologies, why would you step over and dig into learning more than the basics of Python? Likewise for most of these languages/skills.
Because I've seen what people who "don't know Python" post on the internet. They can't match parentheses, can't make code that runs at all, don't know the difference between a string and an int, can't distinguish function return from printing. They definitely aren't people who have skills in JavaScript, C#, know the basics of Big-O notation and have worked with Python for several years.
I'll accept his word that he doesn't know the details of Python module imports, that he isn't an expert, but surely there's a difference between "not expert at X" and "don't know X"?
> They can't match parentheses, can't make code that runs at all, don't know the difference between a string and an int, can't distinguish function return from printing.
You are commingling "Don't understand programming" with "Don't understand Python" here. It's pretty clear that an experienced developer is going to understand parenthesis matching. But regardless of how many years of experience you have in Javascript, the first time you see a Python list comprehension you are likely to scratch your head for a bit. Python lambdas are also quite alien if you aren't familiar with them.
FWIW, I was a professional developer for 10+ years before I learned Big O notation. I know a fair number of junior and even senior developers who don't know Big O notation even now. I knew how to avoid Big O type problems long before I knew the terminology.
> "the first time you see a Python list comprehension you are likely to scratch your head for a bit. Python lambdas are also quite alien if you aren't familiar with them."
The first time, sure. After several years of working with Python? He does say that. "I have worked with Python for several years".
Maybe you should re-read the post you penned then. Because from my perspective, it sounds like you feel a highly recognized javascript developer isn't qualified to work as a front-end developer.
I think the point is that frontend engineering is a broad field, and everyone has gaps in their knowledge, especially specialists who focus on specific parts of the stack. It's possible to provide a ton of value on very complex frontends without doing much in the way of CSS, for example. I've been part of projects where the UI layer was handled in a dedicated library, and unless you wanted to be part of that team, you didn't have to understand much more than which React components to use when. Graphics are a specific subfield of frontend that isn't relevant to most apps, and it's possible to go very far without being exposed to it.
Well it makes perfect sense to me why React keeps reinventing the wheel every other version and why it's a hot mess. This video makes more sense now as to why they don't seem to have any direction https://www.youtube.com/watch?v=iRo18pUs61Q. Frontend web development shouldn't consume braincells and React reeks of shortsightedness.
That video is quite poor in my opinion. He comes so close to a bunch of very interesting questions like how can state management and effect management be done in a good, scalable way that maybe isn't tied to our view libraries? He then admits he has close to no opinion on the matter and suggests web components which is about a step above the jQuery scenario he described in the beginning.
The way I see it the advent of React was basically a rejection of the imperative MVC model that was prevalent with the likes of Backbone in favor of a more functional one. The first prototype of React was written in Standard ML. The evolution over the years with flux, redux and hooks was basically answering the questions that were arguably solved in MVC already but coming from the angle of declarative, immutable approaches. Where does state live? How do I handle asynchronous effects? How do I abstract that behavior from my views?
I don't think the community has answered those questions definitively yet and a lot of the interest seems to be in refining the view mechanics with the likes of Svelte rather than really getting to grips with how to best handle long lived application state and the effects that operate on it.
One thing I could see being the case in a couple of years, and in some ways the video does reach this conclusion, is that the reactive, stateful layer will be abstracted from the view and how you render your data will be almost like choosing a templating engine on the server. It could be React, Vue, Svelte, Marko, Web Components, whatever. Those libraries and components will handle your dropdowns opening and closing and your tooltips showing, but your real application state that distinguishes your product from another will live somewhere else and will be less subject to churn.
I have absolutely no clue what you're rambling about, but if frontend dev shouldn't consume braincells, you go on and code a complex frontend in vanilla JS and then serve that plate of hot spaghetti to the devs coming after you.
I noticed he didn't know much about Rust. And I really want to learn Rust. Except it seems the installation size is reprehensible. Is there a way to install a minimum version to get started with some fiddling about?
From the comments it appears this person has done some significant work in YetAnotherJavaScriptFramework.
Boasting that he's fuzzy on the details of how TCP/IP works (hello, web developer, read a book!), and that he doesn't understand order complexity specifically or algorithms in general - as one commenter pointed out, no wonder JS front ends are such shit if this is the level of intellectual heft their authors have. Not knowing modern CSS - how can this person possibly work in any sort of cutting-edge web development and not know about that?....
Specialisation is OK, but ignoring the general knowledge of how computing works, and then going on to write software used in people's critical systems is irresponsible.
Buy some books. Read them. Understand. It is not hard. But, yes, reading is harder than writing, listening is harder than talking, learning is harder than making stuff up and reinventing the wheel....
I don't think it was meant to impress. On the contrary I think the overarching point of the whole article was that it's okay to not know things outside of your domain that may seem trivial to other CS folks.
> Not knowing modern CSS - how can this person possibly work in any sort of cutting edge web development and not know about that?
Well, I don't know how he does it anymore than you do. But it looks like he's still working on cutting edge web development and it seems to be working out, so my takeaway is that he doesn't need to know about it plain and simple.
How can you say that when there's no evidence for it? Abramov is a successful, respected programmer. Redux is a major contribution to the web. This article came out two years ago, and he doesn't appear to have ruined his reputation. On what basis is it safe to say he "has" to know any of the stuff he freely admits to not knowing?
> Specialisation is OK, but ignoring the general knowledge of how computing works, and then going on to write software used in people's critical systems is irresponsible
If the knowledge is not relevant or applicable to what you are working on, what is the problem?
Knowing that things exist is more important than in-depth knowledge of those things.
This article really makes me feel good as a current intern and comp sci student. I thought I had a clear picture of what a junior dev could be if I worked at it for another year. Having read all the stuff this guy doesn't know and compared it with what I've learned at my internship, almost any hint of imposter syndrome I might have had went away.
Now I'm not saying he doesn't know how to program. I'm just saying it's very interesting to me how little you actually have to specialize in to make it in this field. With what my internship has taught me, I really feel secure job-wise, knowing I won't have difficulty being employed in the areas I know. I hate sounding so cocky, but when I first started schooling again geared toward comp sci, I thought programmers knew just absurd amounts of stuff. Now I've found they are typically just average tech-savvy individuals who weren't afraid to poke around in an OS.
Keep in mind technological knowledge is not the only factor. We pass on hiring plenty of technically competent people because of their communication skills. Knowing what you don’t know is one way to improve your communication skills.
Hang onto the impostor syndrome. Eventually, you come to accept that it seems like everyone around you knows things you don't, and then it keeps you pushing yourself.
I think different people can use those feelings differently. In my case, the thing you describe does work. It is stressful, but it gets me to be productive and have high standards (to the extent that I do, at least).
Don't be too quick to take it at face value. It's hard to write down what is actually in one's head, even for experienced writers. I'm sure he'll tell you a lot more on the stuff he claims he doesn't know if you had the good fortune of having a chat with him in person. Also, keep in mind that he probably knows deeply about some things and their extents, and so his definition of words like 'fuzzy' should be different from that of an intern's. As he puts it himself,
> Experienced developers have valuable expertise despite knowledge gaps
The older I get and the more experience I have, the harder I find landing a job. 10+ years of experience, multiple companies. Also, not that many techs/skills from earlier years are applicable now; it seems there have been a couple of technology cycles since then, and more experienced people may not have much of an advantage over younger ones.
If the people you know do not know much, then the people you know do not know much; there are also people you do not know.
Sure, for example at one time I was proficient in low-level DOS programming, not a very useful skill now. Also, typically, by the time a language or technology gets any good, it is already being obsoleted by something else (examples: Delphi Pascal, Perl). What is fashionable is already on its way to being uncool. I guess Java and Python are next (very decent technologies). Though I believe there is real progress in IT, GPT-3 for example. But it happens through obsoleting older technologies. And I can't really say that I'm better than younger programmers, because at one time I knew things that are unusable today, and I think the converse is also true. There is no happy ending. I guess escort services are similar to IT (not much premium for older/experienced "workers"). I guess the knowledge that stays current in IT year after year is called mathematics.
My idea is that IT knowledge is often of a particular trivia-quiz kind. You may "learn" the next trivia quiz, but you won't necessarily be any wiser after 20 years (due to forgetting and obsolescence) than after 5 years. I guess a painter, sculptor, doctor or lawyer might be significantly better with 20 years of experience than with 5.
There are devs with 10 years of experience that really have ten times one year of experience.
Some places teach IT as rote and magical knowledge. Others teach fundamentals. The first are effectively constantly re-learning and the latter are improving. I always say I would rather hire someone with a good understanding of graphs who has never heard of git than someone who has rote-memorized git commands but has no clue what a graph is. Because the first one can pick up git after reading one or two tutorials.
no offense, but from this list the author is either 1) stuck in a particular domain or language or 2) not very proactive in broader learning or side projects.
I agree it’s ok to not program C or understand network/transport layer in depth. but things like unix shell basics, python, micro services, docker - these are all fundamentals I assume everyone (backend, frontend, mobile, or game engine) has working understanding of in order to be a proficient developer today.
nice that the author recognizes the areas they lack, and should commit to learning in 2021. happy to give good recommendations on books or online classes.
more curious to see this list authored from someone with more diverse experience.
Co-creator of Redux; he was partly motivated to write the article to counter the idea that you need to be fluent in everything, often propagated (intentionally or not) by "as a hiring manager...red flags" posts like yours.
On one hand I agree with his sentiment. But being a co-creator of redux is not really a good argument. I would probably flat-out reject members of the core redux team when hiring, considering redux's design.
It is not composable. I.e. you cannot use one "redux-application" within another one.
Some ideas of redux are great, but the execution could have been much better. I think lack of experience (maybe also related to Javascript) is the reason.
Why isn't it possible to embed one Redux application into another? You can have multiple Redux stores, so I don't see how Redux applications are unable to be composed.
Thank you for your link. I think the link captures it quite well. It says:
> These <SubApp>s will be completely independent. They won't share data or actions, and won't see or communicate with each other.
and
> This pattern is not recommended for parts of the same app that share data.
This is precisely where it gets interesting. Composition is important to combine things without these things knowing that they will be combined in advance and having to change them.
So imagine there exists a redux application that shows a dashboard which lists sales within a timeframe. Now I build a new redux application that wants to use two instances of that existing redux application next to each other, using one to show sales for last year and one to show sales for this year, with the same timeframe (months/days) but for different years. This is a very, very simple case of composition, but it becomes tricky fast.
Question: how can I align the timeframes within the two sub-applications? I want to make it so that if the user changes the timeframe within one of the sub-applications, it translates to the other one and vice versa. Can I do this _without modifying the code of the sub-applications_?
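To make the difficulty concrete, here is a rough sketch (my own illustration, not anything from the Redux docs or a real sub-app) of the glue a parent could write with the multi-store approach. Note that it only works because it assumes each sub-app exposes its store and its `SET_TIMEFRAME` action shape, which is exactly the knowledge of internals the question is trying to avoid.

```ts
// Hypothetical illustration: two "independent" sub-app stores kept in sync
// by a parent that knows their internals (state shape and action type).
import { createStore, AnyAction, Store } from "redux";

type Timeframe = { fromMonth: number; toMonth: number };
interface State { timeframe: Timeframe }

const initial: State = { timeframe: { fromMonth: 1, toMonth: 12 } };

function reducer(state: State = initial, action: AnyAction): State {
  return action.type === "SET_TIMEFRAME"
    ? { ...state, timeframe: action.payload as Timeframe }
    : state;
}

// One store per sub-app (last year's dashboard and this year's).
const lastYear: Store<State> = createStore(reducer);
const thisYear: Store<State> = createStore(reducer);

// Mirror timeframe changes from one store into the other, guarding against loops.
function mirror(src: Store<State>, dst: Store<State>): void {
  src.subscribe(() => {
    const a = src.getState().timeframe;
    const b = dst.getState().timeframe;
    if (a.fromMonth !== b.fromMonth || a.toMonth !== b.toMonth) {
      dst.dispatch({ type: "SET_TIMEFRAME", payload: a });
    }
  });
}
mirror(lastYear, thisYear);
mirror(thisYear, lastYear);
```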
Obviously the author is productive on a "real world business engineering team" considering they work for Facebook. Since you seem to have a lot to critique about the post and the author, maybe it's worth putting in that effort into reflection and self-criticality as well.
Why do side projects matter for a job? Learning, maybe, but they both point to "extracurricular" time spent programming, which shouldn't affect someone's hiring. I've been on hiring panels and have pushed specifically against that mentality, because I strongly believe what a person is interested in outside of their work shouldn't affect my decision to hire them, as long as they're competent during the interview.
I agree with that. it shouldn’t have to be extracurricular. in fact, having outside interests other than programming is a plus.
but there are plenty of ways to learn the general basics of modern computing on company dime. whether thru training, talking with peers, reviewing others code, or taking a step back and looking at architecture outside your specific domain.
if I’m speaking with a frontend engineer on my team who doesn’t understand how his code is deployed or basics on the compute resources used to run that code it’s likely not going to be a very productive discussion
That seems like a weird list. Python? Micro services? Docker?
I have only passing familiarity with Python and as for the two others, I just know what the tools are for, but have never touched either, and I am certainly a proficient developer. My workplace just happens to use other tools.
I am very proficient with the unix shell myself, but there are certainly competent developers where I work that are not.