On Coding, Ego and Attention (josebrowne.com)
549 points by digitalmaster on June 16, 2020 | 170 comments



I get and agree with what the author is saying here, but I also think a big part of this is that in software engineering, so much of what we do is ephemeral. If you're a carpenter you'll know if you're good or not. You'll be able to do stuff like frame a house, replace a door, etc. And then when someone asks you how long it will take to frame a house, how much it will cost, what supplies/staff you need and so on, you'll be able to say.

We've been doing CRUD in our industry for decades. How can we not just say "this is how you do CRUD, we're done w/ that now". We've been doing data serialization for decades now. How can we not just say "this is how you serialize"?

There are communities where this is the case. Why have we abandoned them? Why have we abandoned that knowledge and experience to reimplement things in language X or using platform Y?

We might not like to hear it, but my guess is it's a culture problem. They say the way to get ahead at Google is to build a new successful product. Is that the same thing we're doing? It's easier to get ahead by building a new Z framework than to become a core committer on X framework from 10 years ago? Are most X frameworks run by toxic communities? Is there something specific about software that means tenured projects become less and less useful/maintainable/understandable over time?

There's something in here that's specific to SWE. I don't know exactly what it is, but I think we should figure it out.


CRUD and data serialization are incredibly broad and diverse fields. It's like saying "we've been moving our hands and feet for centuries, how can we not just say: this is how we move our hands and feet?"

Software has so many more possibilities to explore than carpentry which is constrained by our current physical technology. It's far better to encourage engineers to explore these diverse possibilities than to encourage conformity and allegiance to some singular path that everyone is supposed to agree on and work towards. You would simply miss a lot of different innovations by grinding away on the same path. Communities that do so just stagnate.


This is my way of thinking: 99.99999% of applications out there will still store their CRUD in a standard relational DB and run on a standard operating system with standard protocols.

Sure, you can create a lot of fuss all around it, but I feel we create a lot of fuss because of ego, because we want to be seen as having come up with new ways.

The reason to not conform is ego. Software is perhaps the cheapest ego boosting tool ever created.


Good. The only people who come up with new ways are the people who have the ego to try, and the world is richer for it. I'm glad the world is filled with engineers who try and fail and learn instead of those who would rather not create a fuss.


I think developing new ways to do CRUD is great but as an industry we take it too far.

I worked at an agency that produced CRUD apps at a rate you wouldn't believe. Every task was correctly estimated to the nearest hour. Add xyz entity 2hrs, add xyz frontend widget 3hrs, change deployment pipeline 4hrs etc. This was possible because they picked a tech stack and stuck with it.

I've also worked at companies where doing the same task could take 2 or 3 days. A place where no task can be estimated at smaller than 1 day. The reason being that the infrastructure, deployment pipeline, tech stack, etc. are overcomplicated. Way too much overhead.

Unless you are building some massively scalable solution, all you need for the BE is Spring/Django/.NET and an SQL server with a single backend dev who knows his stuff. On the frontend you might need to change frameworks more often, but you can still go a solid 2-3 years building momentum before needing to switch.


I feel your pain.

Especially on the .NET side.

A general history of CRUD in .NET:

- Basic ADO.NET (Not too different from JDBC/ODBC, direct commands)

- First Gen ORMs; Linq2Sql (functional but only on SQL server, and missing some features)

- Entity Framework (4-6) /NHibernate. Lots of people wound up hating this, so they went to

- Dapper. Dead simple; Takes SQL and only maps the results back. Everyone loves it.... Similar abstractions are created over Linq (linq2db, SqlFu) as well, with less (but happier) adoption.

- EF Core is released. Everyone switches back over again.
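To make the trade-off behind all that churn concrete, here's a minimal sketch of the two styles. It's Python with the stdlib's sqlite3 rather than C# (the exact .NET APIs vary by library), and the user table is made up for illustration:

    import sqlite3
    from dataclasses import dataclass

    @dataclass
    class User:
        id: int
        name: str

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO user (name) VALUES (?)", ("Ada",))

    # Dapper-style: you write the SQL yourself; the library only maps rows back.
    rows = conn.execute("SELECT id, name FROM user WHERE name = ?", ("Ada",))
    users = [User(*row) for row in rows]
    print(users)

    # ORM-style (EF/NHibernate territory), shown as a comment since it needs
    # a third-party library: the query itself is generated for you.
    #   users = session.query(User).filter(User.name == "Ada").all()

Every generation in the list above sits somewhere on the line between those two comments.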

The whole thing is silly.


Yeah, all the churn costs more time and resources than it saves. I personally just stayed with Dapper - simple and flexible. I think people have a problem with judging tech based on any benefit rather than cost-benefit analysis. People also value cuteness and elegance in doing 'common' tasks over conceptual simplicity, even when that simplicity means a similar degree of ugliness for all operations.


Yeah this is what I'm thinking. Yeah sometimes we need to figure out how "doors" work on the International Space Station, but 99.99999% of the time you buy a door kit from your hardware store and you're done. Same with serialization or CRUD or whatever, yeah maybe you do have really interesting requirements that are open research questions. But that's rare.

We're verging towards this, "No Code", PaaS, FaaS, Zapier, etc. I'd be super surprised if there were lots of CRUD jobs in the industry in 10 years.


In 10 years there will still be plenty of companies that never adopted "current" trends.


Eh, yeah that's a fair point. I wonder if starting at one of those companies will be like walking into one of those houses built by an eccentric after a while though.


Probably more like a house built 100 years ago. I bought a made-to-measure blind for my flat a few weeks ago. Followed the instructions, went to attach it to my window frame, only to find out that my window frame bows so much that the metal bar won't actually attach to the wall. Stuff like this is rampant in non-modern-build housing, not just houses built by eccentrics.


In houses, upkeep matters more than age. Two of the three buildings I've lived in are about 100 years old (not present on a map surveyed in 1914, present with the right house numbers on a map surveyed between 1920 and 1924), and my current flat is in a 75-year-old building. Reinforced concrete skeleton, and the rest is brick. Best flats I ever lived in: the brick structure dampens the sounds well, and the high ceilings/tall windows let in a bunch of natural light.


Humans have been constructing houses and doing maintenance on them for a few thousand years, but we've only been writing software for a few decades. We certainly didn't reach our current process for framing houses in the first few decades of carpentry.

That being said, I assume that the first few decades of carpentry didn't undergo as many changes as software has in its first few decades. My theory is that software changes so quickly because it can be bootstrapped. When framing a house, you can learn from the process so that you can make the next frame better by changing the process, but the output of that process (the house frame) doesn't directly affect the next time you attempt it. On the other hand, you can write software that invents an entirely new process for editing software (e.g. a compiler or interpreter), which you can then use to write software that might not have been possible before. You can then repeat this process with the new software, creating yet another paradigm for writing software, and so on. More generally, when a process produces a tool that can then be used to do the process in a new way, the process can evolve much more quickly than one that can only be updated with the output of other processes.
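A toy illustration of that bootstrapping loop, in Python (my own sketch, with made-up step names): a few lines of code whose output is itself a new tool for writing code.

    # A ten-line "compiler" that turns a tiny pipeline notation into a
    # runnable function - software producing a new way to write software.
    def compile_pipeline(spec: str):
        steps = {"strip": str.strip, "upper": str.upper, "title": str.title}
        funcs = [steps[name.strip()] for name in spec.split("|")]

        def run(value: str) -> str:
            for f in funcs:
                value = f(value)
            return value

        return run

    normalize = compile_pipeline("strip | upper")  # a tool made by our tool
    print(normalize("  hello  "))                  # -> "HELLO"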


> They say the way to get ahead at Google is to build a new successful product. Is that the same thing we're doing? It's easier to get ahead by building a new Z framework than to become a core committer on X framework from 10 years ago?

A Kurt Vonnegut quote comes to mind:

"Another flaw in the human character is that everybody wants to build and nobody wants to do maintenance."


I think the reality is that some people are actually just fine doing the maintenance - but they're unlikely to boost their career/paycheck by doing so comparable to what they'd have gotten from making a new thing instead. And that's an issue.

I'd love to go back to old code with the benefit of deeper domain knowledge and greater understanding of my tools and be able to make products even better. However, it's hard to square that against making +20% earnings by helping build a new chat app.


> some people are actually just fine doing the maintenance - but they're unlikely to boost their career/paycheck by doing so comparable to what they'd have gotten from making a new thing instead

Is that really the case? Forums like this look down on maintenance a lot, but I find that real-world companies do so much less.


Aside from cleaning and lubrication, a lot of "doing maintenance" is still throwing away old material and bringing in new. Just being selective about exactly which part is at the end of its duty cycle.

People talk about it like there's something wrong when, at any given time, a few microservices are being rewritten. But I would expect that for a sufficiently large machine, on any given day a few parts are being replaced.


Yes, but... Job security in this industry boils down to little more than evolve or die.


If someone wants a website that lists their company hours and has a contact form, that’s pretty much a known amount of hours for an experienced web dev. That’s about the equivalent of asking a carpenter to put in a door.

If someone wants a custom built order and inventory management system, that’s like asking a carpenter to build a custom 4 story house from some napkin sketches.

The whole reason computers are valuable is because they automate away all of the rote, repeated, predictable stuff. The unpredictable part of SWE is not comparable to carpentry, it’s more easily compared to architecture/engineering where the problem statements are vague and most of the job is getting agreements on what the thing will actually be. The carpentry part of programming is mostly predictable.


Agreed. Or in case someone brings up the good ol' "civil engineering" analogy - programming isn't like constructing a bridge. Constructing a bridge is what compilers do. Programming is the design and engineering that results in a blueprint. And our occupation is unique in that the construction part is so cheap, we can design iteratively, instead of actually thinking about what we're doing.


> There's something in here that's specific to SWE. I don't know exactly what it is, but I think we should figure it out.

It's changing requirements. When you build a house, people don't come in 6 months later and ask you if you could make one small change by placing some jet engines on the walls so the house can fly somewhere else during the summer. It's just a small change, right?

The problem is that in code, it often is a small change. Or at least, it is possible to make one quick adjustment to satisfy the new use case. But often, these small changes are also hacks that don't fit into the previous overall design and that would have been implemented in a completely different way had the requirement been designed for in the first place. Now, one of these "small changes" doesn't tend to kill the product, but years or even decades of them do. That's why refactoring exists in software engineering, but not really in home building. Well, in some sense it does exist, in the form of renovation. But nobody thinks it's a good idea to completely renovate a house 25 times around an architecture that just doesn't work anymore for what it's being used for.

If you build a piece of software for exactly one well specified use case and only use it for that, it'll probably run really well forever. But (almost) nobody does that.


The difference between carpentry and software engineering is that the problem space in carpentry is much smaller and pretty much static over time. It's rare for carpentry tools to get an order of magnitude more powerful over the course of a decade, or for 100 people to work on the same carpentry project.


> The difference between carpentry and software engineering is that the problem space in carpentry is much smaller and pretty much static over time.

Anyone is free to compare software development with any engineering field, which typically have to solve large problems.

Thus if you feel carpentry is not a good comparison, then look into civil engineering.

And no, the key factor is not the 'power' of software tools. The key factor is stuff like processes and standardization.

Sometimes it feels like software developers struggle with, or even completely oppose, adopting and establishing processes and standards for doing things. Hell, the role of software architect is still, in this very day and age, a source of controversy, as is documenting past and future work.

We're talking about a field that officially adopted winging it as a best practice, and devised a way to pull all stakeholders into its vortex of natural consequences as a way to dilute accountability. The field of software development managed to pull it off with such mastery that even the document where the approach is specified is not a hard set of rules but a vague "manifesto" that totally shields its proponents from any responsibility for the practice delivering poor results.

If an entire field struggles with the development and adoption of tried-and-true solutions to recurrent problems, then it isn't a surprise that basic problems like gathering requirements and planning are still the bane of the whole domain.


People make the civil engineering comparison all the time, but is software development really that much less standardized?

What's standard in building a bridge? You have some physical constraints (length, what the bridge is crossing), material properties, environmental constraints (temperature, weather, wind, what soil you are building on), and what kind of traffic. Then there are standard 'shapes' (though it's your choice of suspension or whatever). You then have a ton of standard tests that you run to check that the bridge is fit for purpose. But it's not like the bridge is built out of Legos, and even if a lot of standard subcomponents are used, the assembly will still end up being fairly unique due to every location being different.

Software does in fact have tons of standardization. No one thinks of processor arch when doing web dev. Or DB implementation. Or how you modify the DOM (there are a handful of libraries to choose from, similar to a handful of bridge designs).

How do you make a CRUD app? You can do some arthouse project, or you can just use Rails or various Rails-like frameworks. They're all mostly equivalent.

How do you serialize data? JSON (before that XML, I guess). Yes, you can do something different, but you can also build an apartment building out of reclaimed wood.
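And the standardized path really is an import-and-forget, e.g. in Python (record contents made up):

    import json

    record = {"id": 42, "name": "Ada", "tags": ["crud", "serialization"]}
    wire = json.dumps(record)          # serialize to the agreed-upon format
    assert json.loads(wire) == record  # and back, losslessly for plain data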

The real uncertainty lies at the User Interface, which really isn't engineering driven, it's fashion and art and stylistic architecture. So yes, the way websites look tends to change and be fuzzy, but so do clothes and no one complains about that.

I think software people both overestimate the standardization of physical engineering and underestimate the complexity of physical engineers' decisions, presumably they're not just following a recipe.

TLDR: When software standardizes a tool or process it becomes invisible, an import and forget somewhere in the pipeline of tools we use. This makes it seem like there's a lot of churn. But the churn is a bit of froth on the top of really solid foundations. Yes we're always working in the churning part, but that's because the foundational part requires almost no interaction at all.


Ok, let's take another angle on this. The fundamental difference between software and most other engineering domains is that software doesn't involve physical matter (at least directly). The standards and design patterns in civil engineering, mechanical engineering, etc. are driven by physical constraints, whether it be monetary cost for constituent parts, or time cost for delivery, or just the limits of physics in general. Many of these limits are non-existent in software. There is no physical weight to a software object. A poor 10-year-old can make a million copies of it as easily as a rich software company can.

Now, there is software that tightly follows specs and standards, and you typically find it in critical systems, such as medical and aerospace. But there are orders of magnitude more software projects than non-software engineering projects, because they require so little to instantiate. There is almost no barrier to entry with software, and no BOM, and no supply chain.

Perhaps it would help to call only a subset of software projects "engineering" - that would solve the problem. Not all software needs to be engineered. I don't need to engineer a script that downloads some videos for me, or my personal website. And that's not a bad thing.


Inexpensive CNC machines and 3D printers are carpentry tools that have certainly bolstered productivity in the last 10 years. Probably not “order of magnitude more powerful”, whatever that means, but as one very successful carpenter friend put it: “I don’t even fuck around with table saws anymore”.

By contrast, I’m still writing code more or less the same way I was 10 years ago, with mostly the same tools, and have not seen “order of magnitude” level of anything contributing to my productivity.


Basically, this and other comments show that the analogy completely breaks down. The scales, changes in scales, and degrees of freedom are just utterly different from anything physical humans build.


> It's easier to get ahead by building a new Z framework than to become a core committer on X framework from 10 years ago?

Sometimes, yes.

This particular angle is explained in the article:

>> This is ego distraction in action. Self comparison determining effort. If we feel like we’re ahead we continue to put in the effort. If we feel like we’re not, we determine it’s not worth the effort.

The reason people would prefer working on a newer project/framework/whatever is that there is a higher chance they might be able to contribute meaningful code/support. I admit to that, and I am sure many have similar thoughts. It is purely guided by where one thinks success is achievable.

Also keep in mind - progress is being made. Python is clearly more productive than Perl. Or Django vs CGI/FastCGI. So 15 years ago, if those were my two choices for two projects, I would have taken the path of Python. Not just because it was new & shiny then.

Fast forward a decade, Go is clearly more productive than many things that came before. Kafka is clearly easier to manage than home-grown queues via databases and flat files. So why should I stick to old process?

The problem, I feel, is the failure to arrive at standards for anything basic. We have 10 message queues, but limited interoperability. We have 50 popular databases, but no easy migration. We don't even have universal support for Parque in all languages even though it has been around for a while. When can I grep a parque file? Something as simple as Azure blobstore and Amazon S3 can't be linked together without arcane and inefficient copying.


One of the biggest difficulties of ego in software comes from the difficulty of finding "the ground".

New languages are popular. Why are they popular? "Because they are better." But in every other domain of software we also say "The best technology doesn't always win." Why would languages be any different? What if Go is, in fact, Worse is Better? And if it's a Worse is Better, then what is the Right Thing?

Ultimately, I think most programmers, given enough experience, eventually settle on a style and propel the style through the language, not the other way around. And to that end, there can always be new languages so long as there are styles of coding that remain unaddressed.

But this is counterbalanced by the assumption of a rationalist project existing: that code is made to be shared, and to be shared, it must be standardized.

If one looks at the hardware/software ecosystem, it is not rationalist in the slightest, though. It is a merciless field of battle where vendors maneuver against each other to define, capture, control, and diminish standards. The small ones seek to carry a new standard; the large ones absorb the standard into their empire.

Software bloat is a result of this: everything must go through a compatibility layer, several times, to do anything. Nobody understands the systems they use. With each wave of fresh grads, another set of careers is launched, and they join in on the game and add more to the pile.

In that light, rational standards do not exist. They are simply the manifest ego of "us and them", and therefore are mostly a detriment for all the reasons that ego is a detriment.

There exist several examples of excellent feature scaling from small codebases: VPRI STEPS, various Forth, Lisp, and Smalltalk systems, project Oberon, and microkernels such as Minix. The quality they all share is an indifference to standards: they are sometimes used where convenient, but are not an object of boasting.

Therefore I currently believe that developers should think of standards as reference material, not ends in themselves - that is, you use one if you can't come up with a better way of getting the result.


Did you mean Apache Parquet? If not, what is Parque?





From my perspective, the root cause of this problem lies in the lack of one common measure for code quality, correctness, and usability - or even for programming itself.

Say, there are two approaches for a problem - how do we decide which one we go with? In the last 10 years I have not seen a single case where the decision was made based on something other than subjective opinions of a person or a group of people. "Past experience", "this is how it's done here", "this is the only way I can do" and countless other reasons - all of those are subjective and cannot be used for objective comparison of approaches.

You could say "days to implement" or "money spent" is such a metric - but then, there are no reliable ways to mathematically calculate this measure for any code you plan to write and prove it in advance.

To put it another way - there is no standard unit of code/system correctness by which we could measure what we are actually doing or plan to do. Until one emerges, we are bound to continuously reimplement the same things over and over again, justifying it by nothing other than our projections, prejudices and boundless ego.


I agree it's a culture problem - we developers can't agree on anything, even when someone else has already gone to the trouble of defining a standard. I also think there is another component which is inherently related to the software engineering profession: technology moves fast, and some things are indeed worth adopting because they are beneficial in the long run, even if it means reinventing the wheel or having to re-learn something from scratch. But understanding which is which is not that simple. Every time I start a new project in the team, we need to learn a new way to deploy, a new way to instrument the code for metrics, a new integration test framework, the new features of the CI/CD pipeline which replaced the old ones, maybe a new framework or even a new language. This is even before writing any meaningful code. How much of the new stuff is an improvement, rather than just a slightly different flavor of the old stuff?


> Is there something specific about software that means tenured projects become less and less useful/maintainable/understandable over time?

Complexity. Understanding a legacy codebase is pretty much a small-scale research project. You need to gain domain knowledge, become familiar with the team, and get acquainted with the codebase and its history before you'll be able to reliably tell bad code from clever solutions to tough problems. The longer a codebase is developed, the more there is to learn and retain in your head. It very quickly becomes just too much, which means onboarding people takes a lot of time, and day-to-day development also involves being extra careful, or creating obscure bugs - both of which make the project take longer.

> They say the way to get ahead at Google is to build a new successful product. Is that the same thing we're doing? It's easier to get ahead by building a new Z framework than to become a core committer on X framework from 10 years ago?

Yes and no. Not every one of us plays office politics. Some of us code because we like it. The yardstick then is one of personal growth: the ability to comprehend and build increasingly complex, powerful and beautiful systems, or to automate mundane things faster and faster.

But, regardless of the "core drives", one thing is true: building a system from scratch is a much faster way to learn about the problem domain than trying to understand someone else's system and maybe contributing a patch somewhere. We learn by doing. That's why there's so many half-baked libraries for everything out there. Yes, there is ego involved - particularly for people who go out of their way to make their half-baked libraries seem production ready - but a big part of the picture is still that programmers learn by writing code and building systems.

(The difference from most other professions is that people there can build stuff xor share stuff - not both at the same time.)


I disagree that the cause of the problem is complexity stemming from size, and propose that the real issue is the industry's poor history of efficient documentation. Processes to efficiently create and read documents that describe large systems are rarely in place at most of the places I've seen. That's probably the biggest barrier to contributing code to old framework Y. It's just easier to develop framework Z. I agree with you that some things can be designed to reduce complexity, but ironically, whenever something like this happens, someone from an older product will glean ideas from the new one and port some of those concepts over (potentially further proving that the big problem is the lack of resources for understanding).


Technology changes and user expectations change, and we need to adapt.

And it's not my area, but this seems to be true in construction as well? The building codes change, and available materials and components change, as do their relative prices. Maybe not as fast, but fast enough to make older books out of date.


Have to disagree with most of this. Technology changes and user expectations change, but there’s a missing link here to show that either of these really necessitates Yet Another Language/Framework, launching Yet Another Product/Service, or rebuilding things from the ground up. It’s a bit like a homeowner wanting an updated kitchen and a contractor telling them they need a whole new house for it to work, when really the contractor just prefers building flashy new houses for their portfolio over doing renovations on a budget.

Also, side note: with respect to carpentry, books from 50+ years ago on woodworking techniques, framing, joinery, etc. are perfectly relevant today. And many of my grandfather’s tools are still in use in my workshop.


But "good carpentry" is primarily a judgement made based on of physics, with some haptics and design psychology and (hopefully not) entomology.

Humans are pretty good at physics. At the layer of abstraction where carpenters work, our predictive ability is solid.

What fields of science are the primary judges of "good software"?

> Programs must be written for people to read, and only incidentally for machines to execute.
>
> -- Harold Abelson

So it is pretty much _all_ psychology and cognitive science.

Humans are not yet that good at cognitive science because brains are complicated. There is real disagreement about how Working Memory operates -- and Working Memory is core to why modularity matters!


> We've been doing CRUD in our industry for decades. How can we not just say "this is how you do CRUD, we're done w/ that now"

As an analyst, can you explain this bit?

I keep hearing things like "that's not actually a software development job, just CRUD", "we're done with doing CRUD", etc. But it seems like, between the application and the DBA, all the CRUD is taken care of - wouldn't the developer just work on the application itself? And isn't saying "we don't do CRUD anymore" somewhat akin to saying "we don't do [+-*/] anymore"? I must be missing a piece of the puzzle in this discussion.


It's a reductive, dismissive way of thinking, like saying that everything is ones and zeros, or that we're just copying protobufs around.

The data that we manipulate has business meaning and there are consequences for the users that arise from how we model things. Consider the genre of articles like "Falsehoods Programmers Believe About Names" [1]. There is ridiculous complexity here, for those willing to see it, but some people get tired of it.

[1] https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-...


Not OP, but my take: When people talk about "CRUD" in the way you describe, they're usually talking about one of two separate (but related) things.

The "it's not _actual_ development" framing is usually directed at applications which "only" allow users to perform basic actions on some data, basically UIs for manipulating a database. It is absolutely real development (in my view), but less sexy than AI/ML, big data, etc, etc.

You are correct that every application (with some sort of data persistence) needs CRUD. But how CRUD is implemented, for better or for worse, depends on the requirements of the application storing the data. For (most) relational databases, the low-level "how do I CRUD" is well defined: standard SQL queries. But if I use NoSQL, or flat files, or something else, it changes.

The definition of CRUD also varies depending on the layer of abstraction within an application or the perspective of the user/developer. For example: from a DBA's perspective, CRUD is SQL queries. From a UI, CRUD might be a JSON API or GraphQL endpoint. From a server-side application, CRUD might be a specific ORM library.
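A rough sketch of that layering in Python, using only the stdlib's sqlite3; the ORM and API variants are shown as comments, and the item/Item names are hypothetical:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT)")

    # DBA's perspective: CRUD is literally SQL.
    db.execute("INSERT INTO item (name) VALUES (?)", ("widget",))

    # Server-side perspective: the same create, one abstraction up
    # (SQLAlchemy-style, needs a third-party library):
    #   session.add(Item(name="widget")); session.commit()

    # UI perspective: the same create again, as a JSON payload POSTed
    # to some /items endpoint:
    #   {"name": "widget"}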


Yeah, CRUD is a solved problem but you still have to do it.

Mapping state to the database is to web dev what applying paint to the canvas is to painting. It’s how you do it that counts. Saying otherwise is overly reductionist.

Frameworks exist that abstract CRUD away. But you end up sacrificing UX and / or flexibility.


Picking the right level and nature of abstraction for the problem at hand is something of an art. Too high and you'll straitjacket yourself. Too low and you'll spend most of your time maintaining ugly boilerplate.

One of the many reasons why CRUD is way harder than its reputation gives it credit for.


I suspect their point is more: if it's a solved problem, why do we keep making new ways to do it?


CRUD is looked down upon because it's time consuming and repetitive when you do it with poorly designed tools and because it's the most common role.

I think it's mostly a class thing though. Test automation is similarly looked down upon even though it is often much harder to do right than regular coding.

There is a definite pecking order when it comes to programmer roles and it's not necessarily related to difficulty (although it correlates very strongly with pay).


I remember reading an article that studied junior and senior devs and discovered that there was no way to get better at debugging. No matter how much experience someone had, their ability to problem solve was about the same.

I think that might have to do with this complexity, but also: software has so many ways of doing something, even within the same language - and that gets permuted across, say, five different languages (Python, Rust, PHP...). It's impossible to say the "right" way to do it, because there are multiple ways to achieve a valid result that's readable, AND there is a margin for disagreement about what is "readable".
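A tiny Python illustration of that point - the same result three ways, each idiomatic to someone:

    words = ["ego", "attention", "coding"]

    # 1. Explicit loop
    lengths = []
    for w in words:
        lengths.append(len(w))

    # 2. List comprehension
    lengths = [len(w) for w in words]

    # 3. map()
    lengths = list(map(len, words))

    # All three produce [3, 9, 6]; which one is "readable" is the judgement call.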


I feel this needs better context because, besides it being impossible to prove a negative, debugging goes far beyond the essential ability to "problem solve". And as an anecdote, I've certainly gotten significantly better at debugging with experience, among many other aspects. For instance, the ability to recognize a somewhat common bug based on its symptoms is something that, at least within a certain context, improves with experience and is at least to some degree "getting better at debugging".


I’d love a link to that article if you can find it. I wasn’t able to on my own.

I was just thinking today about how to teach someone to be better at debugging.


Well written article, well written response. Sometimes we humans think that perceived improvement of conditions = improved conditions. This is false, but as a business guy calling the shots, my goal is to do the thing on the paper in front of me by the deadline, whatever the cost. Combine that with a developer's creativity and you get a new framework.


I'd venture it's not so much the technique behind the individual layers but the understanding of the need for all the layers and their interactions and the best practices in given situations.

We're prone to tediously repeat the same conversations over and over and take the cosmetic approach rather than the fundamentals-first way of doing things.


I think it's all there if you commit to using tech more than 10 years old.



In my experience, the best way to insulate yourself from the pangs of imposter syndrome and unproductive self-doubt is with experience of past successes.

Early in my software engineering career I would constantly and painfully wonder if I was actually capable of fixing a certain bug or solving a new or difficult problem. But then after working hard on a solution, 99% of the time it would work out. After going through this process of debilitating self-doubt and eventual success over the course of years, it has become much more manageable.

I still sometimes panic when initially faced with a very difficult programming problem, but I can put those fears to rest much more easily by saying, "ok, I've solved hard problems before. I may not know how to solve this particular problem yet, but I feel confident that I will be able to figure it out just like I did in the past with difficult problems X, Y and Z."

At the risk of sounding pedantic, part of leaving the beginner phase — and the true value of experience — is developing a kind of armor against those feelings of inadequacy (of course you don't want this to go too far into feelings of overconfidence or an inability to reflect when things do go wrong).

I also think it's the responsibility of more senior engineers to recognize when a more junior teammate might be having those self-doubts and be empathetic while helping them build up their own successes.


> I still sometimes panic when initially faced with a very difficult programming problem, but I can put those fears to rest much more easily by saying, "ok, I've solved hard problems before. I may not know how to solve this particular problem yet, but I feel confident that I will be able to figure it out just like I did in the past with difficult problems X, Y and Z."

The following is just my experience, but it's a bit of an odd one! So I thought I'd share for fun :)

I have this too and I only have 1 year of work experience.

What helped for me was doing a course where I needed to know:

- C

- X86

beforehand.

The course was about analyzing binaries and malware. I didn't know any C and knew almost no x86. I did the course as a challenge, but to date it has been the most difficult programming challenge of my life. Teaching yourself two prerequisites while following a normal course load at the same time, while feeling insecure and strongly suspecting you're not intelligent enough, was tough for me.

I've worked at 3 companies in that little 1 year of experience (2 times as a freelancer) and nothing has come close yet. I'm hoping for the day it finally gets tougher, but I've heard from people who actually have been full-stack devs for 4+ years that that course was way harder than anything they have ever done.

So long story short: do super hard courses. If they're not the hardest courses of your life, then it isn't hard enough.


Would you mind sharing which course you took please? It sounds very interesting.


It sounds like Offensive Security's OSCP and OSCE


calling it "x86" tells me you were not without any prior knowledge of assembly before :)


Try again; you will be surprised how much simpler things are than they look. Practice, practice, until it clicks.

Give yourself some time in between too. It is all about the fun.


Self-doubt is just as human as fear, anxiety, or loneliness. It helps when there's some social environment that helps counteract these bouts. By the same token, if the existing environment works to reinforce these demotivating feelings, then one needs to acknowledge that it may be "them", not you, and either build some mental defences or just move on.

Everyone on the team should be allowed to make mistakes or make bad choices. Juniors, seniors, managers... we are all people, we're all engineers, we're a team! A cult of perfection is limiting to everyone. The manager's job is to recognize everyone's contribution, no matter how small.

Too often, teams project a higher bar than is actually reachable. Sure enough, it will lead to a sense of inadequacy, for absolutely baseless reasons.


I feel like for me, reading and writing articles like this are a major source of distractions. I often find myself reading articles like this, lessons of self-improvement and tips of motivation and ways to be a better programmer, instead of doing what actually makes me a better programmer (actually programming). To extend upon the article, I feel like one of the easiest distractions from self improvement is constantly reading about self improvement. Not to say that the lessons in these articles are a sham, but that there's a point where the idea of and dream of improving yourself becomes a dangerously stealthy distraction.


The thing about working on yourself is that it’s actually work. Reading an article, or a book on behaviour, self-improvement and what else doesn’t actually change you any more than reading Harry Potter does.

It’s the years of applying Zen Buddhism, scheduling your chores or staring at the mirror telling yourself you’re a great person that changes you.

I know because I recently recovered from a major depression and anxiety, and everything that I’ve done that has actually helped, like lying to myself in the mirror, or convincing myself no-one on the train was actually judging me, took 6+ months to have a real lasting effect.

It’s the same with distractions. Just look at your screen time spent on your smartphone today. It’s probably a couple of hours by the time you go to bed. Like it is for the rest of us. Most of that time is frankly wasted, you know it. I know it. But reading a self-improvement article about how cutting down screen time is healthy for us isn’t actually going to change our behaviour one bit. Maybe for a day or two, but not next week and certainly not next month.


"The thing about working on yourself is that it’s actually work. Reading an article, or a book on behaviour, self-improvement and what else doesn’t actually change you any more than reading Harry Potter does."

This is well put, and I think part of the reason so much self-improvement material is drivel. Generally, I've noticed that some of the most pathological people are the most into 'self-improvement' as an idea. That being said, their brand of 'self-improvement' generally does not extend beyond reading and quoting books by various gurus.

On the flip-side, those I've met who are actually highly motivated and disciplined, have never picked up one of those guru books.

Reading up on something is one thing, and in many cases, it's an important first step. There's no way to start using a new language without reading something. That being said, simply reading is not enough. On top of that, what you read has to be actionable. The self-improvement platitudes are not actionable. Reading a book on Python does not turn you into a Python developer. Why should reading a guru book be any different?


> I've noticed that some of the most pathological people are the most into 'self-improvement' as an idea.

I've noticed that some of the people who spend the most time paying attention to their blood sugar are diabetic.


My point is more that they are checking their blood sugar levels without doing anything about it.


hard things are hard.


The way I look at it, information is a force multiplier on action. If there's no action then 50 times zero is still zero.


But then if there’s no information, zero times 50 is still zero.


I'm not sure I agree, because the situation is in my opinion not symmetrical.

Work without reading about it: might work, and often does. Reading about it without doing the work: way less useful.


My view is the opposite, basically the adage “measure twice, cut once.”

Work without research often is actively harmful in addition to failing and wasting time & resources. Research at least improves knowledge while not wasting other resources besides time.


In science, engineering and coding, I agree!

However, I felt the context of this conversation was self-improvement. In this particular context, it's easier to get things done without reading any motivational books/articles (in fact, most people get things done without reading about how to self-motivate), and the contrary - reading self-improvement articles - doesn't mean anything if you don't do the actual work.

Let me quote the initial post of this subthread, which is the sentiment I agree with:

> "The thing about working on yourself is that it’s actually work. Reading an article, or a book on behaviour, self-improvement and what else doesn’t actually change you any more than reading Harry Potter does."


I think it’s true no matter the domain. You should take a scientific approach to self improvement.


Spend a week in the lab to save a day in the library?


I should have been clearer: I was talking about motivational or self-improvement sort of books and articles, as is the topic of TFA.

I wouldn't try it with lab work, though pioneering work sometimes was that way ;)


I was mostly being facetious, thanks for taking it so well :)


Worked for Edison


Once you've read enough philosophy and psychology (both academic and pop), articles like this start becoming repetitive. You realize Jean Baudrillard was right, and in the post-modern world there truly is nothing new under the sun; everything is just a variation of a variation of a variation of something that came before. A better use of your (professional development) time is thinking about and writing actual code.


“The safest general characterization of the European philosophical tradition is that it consists of a series of footnotes to Plato” - Alfred North Whitehead


Programming isn't necessarily the best way to get better at programming, there are a bunch of related skills that also need to be improved. Thinking about how you program and behave as a human will unlock many doors


Programming is certainly the best way to get better at programming. Some other things may help improve your ability to program, but there is no substitute for the act itself.


Nah, I am in agreement with the poster you replied to.

Our tech lead is a bit of a diva. He is smart, but basically he just programs and doesn't bother with much else. He bangs out code quickly, but it can be buggy, and it's usually the rest of the team that fixes the bugs, keeps the infrastructure running, and writes the tests. He is good at tricky algorithmic stuff. His code is fairly well organised. I don't find his abstractions particularly good. The REST API he created is terrible (poor abstraction) and not RESTful: a lot of it uses POST requests, and 200 "success" responses contain errors. No tests. Terrible at explaining his work to other people. Poor at listening.

Give me a good team player with average ability over a good programmer that lacks the other skills any day.


Agree absolutely. I think the real problem is our interview process. More focus is given to solving tricky questions than to overall craft. In day-to-day tasks, how many times do you have to implement those algorithms? (I am not against knowing algorithms, though.) Good code is code that fulfils its requirements perfectly and is MAINTAINABLE.

Python's PEP 20/the Zen of Python is one of the best guides to craft, imo. It works well for the individual programmer as well as for teams.

https://www.python.org/dev/peps/pep-0020/


No.

Programming -with intent- is the best way to get better.

If I code up a 10k LOC main.cpp with stringly typed data structures, I'm not really better at programming, am I?

It's like that saying: practice doesn't make perfect, perfect practice makes perfect.

Programming is not literally just typing, as we all know, nor is it simply getting a Thing to compile. A lot of it is educating oneself on different types of data structures, algorithms, math, architectural practices, and so on. Expanding our workbench of tools, as it were.

And -then- putting that into practice when actually programming.

I'm not good at Rust merely because I've worked with Rust a lot; I've also read books on Rust, and I've read many web articles on Rust (found from Rust Weekly) and various libraries, etc.
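(To make the "stringly typed" aside above concrete, here's a hypothetical Python sketch - the idea carries straight over from C++:)

    # Stringly typed: everything is a string, so nothing is caught for you.
    order = {"status": "shiped", "qty": "3"}   # the typo sails right through
    total = int(order["qty"]) * 10

    # Typed: the same typo now fails loudly at the point of creation.
    from dataclasses import dataclass
    from enum import Enum

    class Status(Enum):
        PENDING = "pending"
        SHIPPED = "shipped"

    @dataclass
    class Order:
        status: Status
        qty: int

    typed = Order(status=Status.SHIPPED, qty=3)  # Status("shiped") would raise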


Wholeheartedly agree. Programming is a vague term anyway. Speaking from experience: I could only code basic JavaScript, with almost zero understanding of the more complex fundamentals, but could actually build functional SPAs. Repeat that for a few years and I'd only gotten better at creating apps via programming but learned little in terms of fundamentals - and so trying to parse some debate on JavaScript would end up going over my head every time.


> If I code up a 10k LOC main.cpp with stringly typed data structures, I'm not really better at programming, am I?

I think you are, you are better having done that than before. It might not have been the best improvement you could have gotten out of it, but still.

The relevant XKCD is this one: https://xkcd.com/1414/


An unresolved mental dilemma is a part of a lot of my worry and inaction.

Basically it goes: If you just keep working, will you always keep making progress towards where you want to be?

I don't necessarily mean "if you just put in the effort then you'll succeed," which I do not believe in. People talk about "practice with purpose." You have to know the parts that you need to improve on and correct them if your actual intention is to get better at something. I believe that works better than taking any arbitrary action at all, with the same goal in mind.

So the problem is not knowing whether writing that 10K LoC program actually helps or not. I forget things I've done. I lose interest.

Then I extrapolate from this and think, then there must be some spectrum of things in between that are not practically useful, and if I keep doing them then I will not improve in the ways that I want. I will believe that maybe writing a stringly typed C++ application is just reinforcing bad habits that I will have to expend extra effort to undo later. I then believe if that's the case then I ought to not do that thing at all if I believe it's just going to hinder my progress.

The problem is that this mindset costs me a lot of my action, because I figure if what I'm doing is not beneficial for my skills then I'd better find something else. A lot of the time that "something else" is something less challenging, all the way down to nothing productive at all. So I end up believing I'm just coddling myself in an attempt to avoid "wasting time" not really improving.

I think this kind of fallacy stems from a fear of banging my head into a wall expecting to get better at some point without knowing if I'm actually on the right track. At least if someone knowledgeable teaches you they could suggest so. And that fear stems from placing too much value on intellectual success as opposed to enjoying the process. If you only enjoy something on the condition you improve, then it discourages you. I've been discouraged a lot.

It could also be due to divorcing enjoyment of something from improving at it. I simply always care about improving, and if I don't see improvement then I'll lose interest. But some say that people who enjoy things just improve on the basis of doing it at all. I just can't seem to get myself to believe it, though.


1. For each activity that you want to improve in know exactly why you want to improve. If it’s because you hope it’ll be fun later but it’s not fun now you should probably stop. There will be a time it’s less fun than it was in the beginning and you’ll give up then anyway. If it’s to get to do something different on the other side, stop expecting enjoyment and just pay the price.

2. Bias towards action. If you want to start running, just go run. Don’t read about it. Don’t sign up for a race. Don’t buy better shoes. Just go run for a while (or write some code or say all the Spanish words you already know out loud).

3. Spend 10-20% of your training time (do not go outside of this range) on improving your training. This is when you watch that video about your activity. People naturally gravitate towards 0 or 100% of time in planning. “A little bit” is the best but rarely done.

4. Check in with someone better than you on a regular schedule to make sure your training is progressing well. Weekly is very good. This could be a coach, mentor, partner, something like that (not an accountability buddy).


I think this is a pretty deep statement and I think it reflects the way I feel as well, and probably many others.

It's because we lack confidence in knowing if our system of improvement is going to work and we don't want to waste time.

I think you kind of need to Let Go and enjoy exploring or maybe just take structured online classes that you pay for.


Applying self-improvement takes time, work and effort.

You are applying it when you change as a result of reading and especially using what you learn. Change is essential; if you do not change anything, you are not using it.

It is easier to just (passively) read something than to apply it. The problem with reading (or watching videos) is that it can be used as an excuse for procrastination, as it is way easier to do something passive than active.

The most interesting thing is that the problem is not in reading. I worked with a kid whose parents were worried as he used videogames to procrastinate. They put the console out of the kid's reach and now the kid will just stare at the wall for hours, daydreaming.

So my advice is for you to start applying what one book about procrastination says. Select just one good book and start applying it in your life.

It is very important that you just decide and pick one - "The Now Habit", for example.

Write down in a journal the difficulties you face, your emotions while doing so and work over it consistently.


It's ironic that one of the best forms of self-improvement is to get over yourself, while the act of self-improvement is often entirely self-focused.

I think, though, that the author makes a really valuable core point: Most challenges are hard not because of the subject but because of our approach and perspectives. I can't think of anything important in life that doesn't benefit from the exploration of metacognition.


Big tired after an intense day, and I haven't really sat down and digested the article, but it seems close enough that I think my - admittedly primitive - tip can be of relevance.

If you get stuck, tell yourself - in whatever way you want, and honestly - some version of the following: "I don't understand this thing that is happening, but I know there is a cause. It does not happen without cause."

Honestly, it's a bit odd, and I don't know if that's the best way to express it in English. Nevertheless, several people have come back to me and told me that it has helped them.

My initial inspiration, and hypothesis, is that the simple acknowledgement that I don't understand the problem, and that the problem - despite my lack of understanding - still follows the laws of cause and effect, somehow temporarily halts our brain's tendency to protect our ego at almost any cost, logic be damned.

I started trying this out after puzzling about why it's unreasonably common to figure out the answer to something only moments after you get up from your desk to go ask someone else for help, even when you might have worked with it for hours. It had to have a reason, although I don't know exactly what it is!


> I started trying this out after puzzling about why it's unreasonably common to figure out the answer to something only moments after you get up from your desk to go ask someone else for help, even when you might have worked with it for hours. It had to have a reason, although I don't know exactly what it is!

Well you've heard the advice on looking at a problem from a different point of view, right? Usually this is intended in the sense of changing the context or reframing the problem, and it works, but takes effort because we all have our default go-to mental models. But it turns out that changing the mode of your thinking (eg. visual vs. kinesthetic, etc.) is just as helpful, and the act of trying to phrase the problem verbally is usually just different enough from just thinking about it (I believe even if you are mostly a verbal thinker) to do the same trick.

Hence "rubber duck debugging" where you solve the problem by describing it to a rubber duck rather than another human.


> If you get stuck, tell yourself - in whatever way you want, and honestly - some version of the following: "I don't understand this thing that is happening, but I know there is a cause. It does not happen without cause."

Seconding this, a quote I have from a past computer science teacher of mine is: "Someone with a brain wrote this code, so you - as someone with a brain - can understand it". Definitely helps me when I'm really stuck on a problem.


The way I first heard it was "There's no magic in the world."

Every effect has a cause, and when debugging your job is to know your codebase well enough to be able to quickly pinpoint that cause.


Love this! I know exactly what you mean. Thanks for sharing!


I love the introspection and positive attitude in this post.

One thing I would add is that intrinsic motivation seems to be framed and activated very differently depending on one's personality. I find myself performing best when I'm on the edge of failure, trying to catch up to the high performers, and when recalling past times when I overcame failure or adversity. Comparing myself to the group described as demotivating in the post is the best motivator for me. And then there are little tweaks to one's environment (for me it's coffee, exercise, occasional travel, specific movies and music) that I find end up making an enormous difference in motivation, focus, and overall mental state. I suspect this has a lot to do with personal physiology and the environment in which you grew up.

With that caveat, the post is incredibly thoughtful and helpful, and I really enjoyed reading it.


The part about problems being either fun challenges or a nightmare really resonated with me. That's why pair programming is so important in my opinion, if you work too long in a silo the magic, fun, craft whatever of programming fades away. Just watching a coworker code or talking through problems with one can bring back that spirit of fun challenges though. Engineer morale is super important.


Agreed, sometimes a second pair of eyes or rubber duck programming does wonders


100% Agree.


I read this, and I have also read "The Practicing Mind" which is exactly what this post is about. My issue isn't in understanding the premise. It all makes sense, and I get a sense of "ah hah!" every time I read it (I've read it twice).

The issue for me is that I really struggle to turn this theory into effective practice. Each time after reading "The Practicing Mind" I have tried to cognitively remind myself, whenever I was frustrated, to stop and look at the problem as a beginner would, to drop my ego, etc.

The problem is that it would sort of help, temporarily. I'd find myself a little bit better at getting a solid day of work done, but not dramatically better. After a week or so, I'd forget to even do the exercises, and I'd be back to struggling.

What honestly helps more than anything, the real "magic bullet", is pharmacology (Adderall). For me, it somehow calms me down. I don't feel more energy; I feel tranquil, and able to let defeat roll off my shoulders.

Sadly, taking Adderall is not a sustainable solution. Amphetamine is a neurotoxin which raises blood pressure. Not to mention, I don't like being "tranquil" for anything other than my work. I like my 'normal' state of semi-uncontrolled energy, which is great for exercising and video games. I'd like to be able to turn this feeling on or off, and taking a medication doesn't allow for that.

So I tend to seesaw between three states: 1) struggling at work, barely getting by, quality of life sucks; 2) on medication, happy at work, feeling productive and peaceful, but wanting to get off the medication; 3) off medication, using "Beginner's Mind", but finding that my ability to implement it in a strongly effective way is absent.


Thanks for sharing this perspective! I know exactly what you're talking about. I wrote this piece and have to admit that even for me I have days when I struggle to bring this attitude to a problem or a day. I guess it's one of those hard habits to break.

What helps for me the most is intentionality. To literally set my intention for a day or for a problem right before I jump in. So if I know I'm about to jump into a tricky problem I literally take a few seconds to remind myself of the attitude I want to bring and even exactly what I want to focus on.

So this would be things like "Don't try to judge difficulty (easy/hard), just go wherever it takes me" or "Don't be afraid of the amount of work". One that's super helpful for me is deliberately separating the "understanding" part of a problem from the "solving" part - so I'd tell myself "I'm just trying to understand what's going on right now - solve later". Etc.

Hope this helps.


Funny you mention Adderall, as I just took one today to help me be more productive after a long, exhausting weekend out of town. I totally agree with what you say about how it affects work in a positive way, but that it isn't a sustainable solution. I believe the key is moderation. I won't allow myself to use one more than once a week at most, which I feel does a good job of keeping any sort of psychological dependence away, minimizing the negative physical repercussions it could cause, and keeping the feeling fresh.


I'm guessing that you've already heard this from like 10 people, but have you experimented with a lower dose or other ADD meds? I have heard that methylphenidate and Vyvanse have weaker effects on mood while still helping calm down distractions. Their effect is a bit less noticeable in general.

As always ymmv, and good luck :)


This post did not reach the conclusion I was expecting based on the title. For me, I think I've largely experienced the opposite relationship between ego and my programming productivity.

Learning to program as a kid was probably one of the most exciting developments in my life up to that point, and I expect that's true for many people on this forum. I originally attributed this to programming's usefulness, and the mathematical beauty of watching all the pieces fall into place when solving a problem. And those were surely both important motivators, but, looking back, the primary motivator was the pure power trip of it. Programming is extremely powerful (software is eating the world, after all), and I could immediately sense that, and that power was the biggest high I got from it.

Throughout my teens and twenties, I didn't really consider this, and just followed the high, and it led me to develop skills and a successful career as a programmer. For me, it was a positive feedback loop, where the more I put into programming, the better I got, and the bigger the ego boost. Unfortunately, though unsurprisingly, it got to a point where my inflated ego started getting in the way of my personal relationships, and even my self-perception. I considered myself a great programmer, but not a very good person. I became quite self-loathing for many years, but I've noticed that's healed up after moving away from programming as a primary job responsibility, and my personal relationships have benefited, too.

I still love programming for the beauty of it, and I still dive into little personal programming projects a few times a year. Part of me wishes I did so more often, but I'm held back because the only way I've found to get through a project of any duration longer than a few days is to basically develop delusions of grandeur about it. Programming is fun and beautiful, but very hard, too, and somehow without the promise of the conference talk, or the influential git repo coming out of it, there's just too much friction. So, more often than not, these days, I simply don't bother. I guess with my current middle-aged testosterone levels, I'd rather keep my family and friends than be king of the world.

(That said, if anyone out there finds this relatable, but has been able to push through and develop a healthier, less ego-reliant relationship with programming, I'd love to hear about it!)


This really resonated with me.

Before getting into programming, I was a somewhat accomplished guitar player. By the time I was 20, I had played in a bunch of bands, recorded several albums, and gone on tour. As a result of these early successes, I developed a big ego about myself as a musician.

I realize now that the main thing driving my musical career was that ego. I enjoyed playing, but getting better at my craft was not my primary driver. Instead, it was that I wanted to be famous and rich and noteworthy and desirable. For me, playing guitar was inextricably linked with becoming a certain kind of person and gaining status.

Now any time I pick the guitar back up for more than a day or two, I quickly get lost in delusions of grandeur. I start thinking about how I'm going to change my whole lifestyle to "be a great guitar player", and playing itself takes a back seat to fantasizing about gaining power and status. Try as I might, I can't just casually play guitar for its own sake - kind of like how you have trouble programming without the promise of a conference talk or an influential git repo coming out of it.

For me the solution has been to avoid playing music, and to focus on programming (and my family/friends) instead. I think the groove of ego I carved out as a guitarist is just too deep to allow me a healthier relationship to music. As a programmer, I don't have that same narcissistic false-self to live up to. I just enjoy it and want to get better because it's fun.

Maybe the solution for you could be to take up a creative pursuit other than programming?


This resonates with me. I started as a self-taught freelancer and loved the symphony of creation and the end product you make. Then I got a job, and I had such enthusiasm and thirst for knowledge that I would go above and beyond. Then I burned out, and the cycle has repeated a couple of times now, forcing me to take breaks every 7-8 months. One thing I realised is that I still love creating stuff, but it's all too painful to do it for someone else. The major contributors are the constant changing of requirements (sometimes of goals altogether), the scrapping of projects you put a lot of brain power into, and sometimes fighting against the tide. I have experienced people who are just there in middle management, adding an unnecessary layer of red tape and doing anything to survive. I feel I'm done with it. I have picked up some other stuff and am currently searching for something other than a programming job. It's a risk, because the majority of my work has been programming, other than a failed startup. But I think I will take the risk of exploring.


Software developer jobs pay so well that many programmers don't know what else to do. I feel the need to disengage from the laptop so many times, but damn, I don't know any other way to pay the bills. There is also always the fear of losing track of new tech and being jobless.


So, I've not really had that set of things, but when I went through a phase of working in fast food, that was helpful for disentangling who I was from my ability to work as a software developer.

I suspect that sort of thing won't be much help to you, however, because for me, my motivations have been relational more than power-based.

That, and I find enough joy from programming just from the utility of things I've worked on, rather than needing it to be a vehicle for influence.

Still, I'm very interested to see where things go for you.


> My ego creates a tight bond between my work and my identity. Linking my self worth to how well I do my job. This then creates the need to track my performance. To keep score. Spinning up mental processes that consume valuable resources which make staying on task very difficult.

I think this part is definitely true for me...

I have let "notice when you are confused" and "understand the impact of your work" and "make sure you are building the right thing" and "make sure you know stakeholder needs" get kinda etched into my identity. I keep wanting to _understand_ the systems I work with, and I keep getting distracted by noticing problems with their UX or their implications for business processes.

I can turn that voice off with deliberate effort, but I don't know how to get it to stay off.

Does anyone else have any methods for more permanently silencing UX worries and just cranking out code?


No. I was unable to silence those thoughts and became a product manager, and am now a UX practitioner with strong opinions on software dev which I also can’t seem to silence (speaking of ego).


To be clear, I'm not suggesting you stop thinking about things like "understanding the impact of your work". In fact, that's exactly what I think you should be thinking about. That's the "the"-type question. It's only when you start to think thoughts like "Am I doing a good job at understanding the impact?" that you get away from the goal. "I" is at the center of that question, instead of "the impact of your work".


I am not convinced by the article. For one, I don't think beginner's mind is necessarily superior to expert mind. After all, why would society value the expert more than the novice? Has anyone actually checked whether experts don't have an even greater openness towards learning and novelty?

About giving up on projects and how the ego plays into that: I don't think in such black-and-white terms - giving up = bad, persevering = good. Sometimes you need to give up in order to find a better approach. There are reasons why this instinct is present in our species (something to do with the exploration-exploitation trade-off). We can't paint over it with self-help advice.

Comparing yourself to others is bad? Why? It's an evolutionary advantage to learn from the experiences of others. By making comparisons you can calibrate your values. Competition is a great motivator. Having a role model can be a fast way towards improvement. Comparison between peers is like a second-order metric; first-order metrics rely only on the self.

The advice about not comparing yourself to others is useful only in a limited setting - where you devalue your accomplishments and have nothing to gain from the comparison. But when comparison motivates you to improve, it's actually not bad. Also, when comparison prompts you to take action and avoid a crisis, you could be spared a lot of suffering. Comparison can act like an alarm. Another function of comparison is to make groups more cohesive - if they form a common culture they can function better - so aligning oneself to the group can be beneficial for all.


Pretty interesting article. If this spoke to you, I recommend reading "The Drama of the Gifted Child" by Alice Miller.

Don't be put off by the title; it's a wonderful read whether you think you're gifted or not.


Counter point:

I read that book this year.

It definitely contributed towards resolving some unresolved childhood trauma, and I'm grateful for it, but it was no walk in the park.


Why is that a counter point?


The parent shared the book as a wonderful read and said you shouldn't be concerned about the title.

My counter point was that while I enjoyed the book's results, the read/process was the polar opposite of wonderful.

I felt my anecdata might be helpful to those who might pick up the book.


This is so critical and so easy to forget. Every day is an effort to remind oneself of these basic truths.

“Whenever distress or displeasure arises in your mind, remind yourself, “This is only my interpretation, not reality itself.” Then ask whether it falls within or outside your sphere of power. And, if it is beyond your power to control, let it go.” ― Epictetus


I needed to read this. Just today I realized that my ego was distracting me from learning and being effective at my job. I am a new dev, <1 yr experience, and I've often found myself not reaching out for help when I need it. For no good reason, of course, other than fear of outing myself as not knowing something I perceive others might view as basic.

I need to teach myself to take the ego hit and that in the long term it'll pay off more than independently struggling on a problem.


I'm pretty senior (over 10 years), and I see this a lot with my younger colleagues. They will struggle for hours and hours before reaching out for help, which is obviously counterproductive.

I try to counter this by reaching out myself when I need help, which is pretty regularly. I'm hoping my younger colleagues will see that even the old fart needs help sometimes and isn't afraid to admit that he doesn't know something.


"This is ego distraction. It’s about putting off uncertainty till later to buy temporary relief. To protect our ego from a perceived threat. Things like hard problems, the possibility of failing publicly, or negative feedback all become threats when they’re linked to my identity."

I disagree with this [at least for me personally]. Being a good programmer is just straight up not something that's part of my identity. There are millions of people who are much, much better at it than me, and that's totally fine.

The reason it's hard to work on hard problems is that they're hard! Sometimes programming can be a really difficult activity, a slog. It's the same with mastering any skill: learning to play guitar, becoming an Olympic athlete, whatever. You can't Zen Buddhism your way out of the fact that you're going to spend years and years practicing until your fingers bleed, or until you're completely exhausted, etc.


> You can't Zen Buddhism your way out of the fact that you're going to spend years and years practicing until your fingers bleed, or until you're completely exhausted, etc.

Many Zen Buddhists know how long and hard the practice for mastery is. Practicing meditation is in many ways skill acquisition as well, which is why it is called a practice.

You’ll of course be spending years and years practicing programming, and the insight that ego identification gets in the way, and that one must practice beginner's mind, is a simple yet deep understanding that comes from years and years of practice.


> You’ll of course be spending years and years practicing programming

You can't just 'of course' this! I mean, you can, but that's the whole point. If you, or anyone reading this, cared enough about being a great programmer, you wouldn't be on this site in the first place. Which is fine, I enjoy wasting time on here as much as anyone. But the people who are actually really good at programming? They're not reading blogs about ego. They're not writing blogs about ego. They're programming.

Look at what Fabrice Bellard has accomplished in the last 20 or so years: https://bellard.org/ . QEMU, FFmpeg. I will never even be close to his level. I'm much closer in relative skill to the person who wrote their first hello world yesterday, and I've been programming for 15 years or so. And that's totally fine with me. Programming is not my only interest in life.

As for the blog author, it seems that he's searching for a reason that he's not as good at programming as he thinks he should be. I mean, he's already a Staff Engineer at Circle CI. He's not someone who's been programming for a year. It's very possible that he's pretty close to being as good as he'll ever be. Sure, he'll keep improving, but he'll never be Fabrice Bellard. If he were, he already would be, and he wouldn't be writing a blog about why he's not.

So what I would say to him is: that's fine! Life is not just programming. "The Second Truth is that this suffering is caused by selfish craving and personal desire." You wrote an article about how your ego gets in the way of you becoming a better programmer, but it's your ego that makes you want to be a better programmer in the first place!


You can certainly work towards deeper ability without attaching to the outcome, or even use the ego's craving to get there.


Your ego drives every action of your existence, so no, you really can't. What you can do is be self aware of that reality.


That hasn’t been my experience, but ok.


Wow, well said! You're always doing what you REALLY want in that moment. Period.


> It's the same with mastering any skill: learning to play guitar, becoming an Olympic athlete, whatever. You can't Zen Buddhism your way out of the fact that you're going to spend years and years practicing until your fingers bleed, or until you're completely exhausted, etc.

The point of the article isn't that there is a shortcut around having to put in the time, it is that ego sabotages your efforts to put in the time, and, if you're somehow coerced into spending the time anyway, it can prevent you from receiving the expected benefit (e.g. just staring at the code on the screen probably won't help, unless the problem is literally a typo).


>it is that ego sabotages your efforts to put in the time

Sure, but what I'm saying is that that statement is almost meaningless in the grand scheme of things. Is not letting your ego get in the way a necessary part of getting to mastery? Absolutely. But no matter how much you change your thought process, or analyze the problem, you still need to eat the proverbial whale.

Think of all the people who've ever played basketball and have had any aspirations of joining the NBA. For the vast, vast majority of them, they just were never going to be good enough. They didn't have the talent. It didn't matter how much they practiced, how much they got their ego out of the way, how badly they wanted it. Only about 3,000 people have ever played in the NBA.

Now, I'm not saying you need to be in the NBA to be a "successful" basketball player, or whatever you want to call the equivalent of that for programming. What I'm saying is that everyone has ceilings for how good they can become at something. Maybe on the relative scale, the best you can ever be is a pretty good programmer, and that's it. No shame in that!

After all, part of Zen Buddhism is accepting who you are and your limitations.


> After all, part of Zen Buddhism is accepting who you are and your limitations.

The teaching is that all beings are capable of enlightenment, and it outlines a path to accomplish that. The concept of "you" is part of the problem, and meditation on sunyata can offer insight into that.


Enlightenment isn't a thing that can be accomplished. That's an oxymoron.


> I disagree with this [at least for me personally]. Being a good programmer is just straight up not something that's part of my identity. There are millions of people who are much, much better at it than me, and that's totally fine.

Then why did you call yourself "good", why not just a programmer? ;-)


I didn't. I said that being a "good programmer" isn't part of my identity...


Oh yeah, you're right. Hah! Sorry. It is part of my identity, which is problematic, but that's my sh*t I've been working on (for years...)


This is an interesting article because it made me consider how I think personally, as a highly productive programmer but also a leader at my company. Even if I didn't touch the code, I am personally responsible for it, considering that I am the lead and have - or should have - reviewed it before it went out. If you go in with the assumption that you are accountable no matter what, it isn't really a big deal. If a subordinate f'd up, then you are still accountable, because you should have caught it.

Thinking this way is very liberating because it means everything is your fault, but... you are human and humans make mistakes... so that means this is a learning experience. If your mindset is that we are constantly learning, no mistake can ever really touch your ego.


A great book that the last quote in this post pulls from is Suzuki’s “Zen Mind, Beginner’s Mind”. [1]

I often find the solution to many of my problems is to go back and practice from this mindset. I coincidentally went back and reread selections from this book a couple weeks ago, as I found my ego had been creeping into many facets of my life recently and I needed to go back and be reminded to practice with this mindset.

[1] https://en.wikipedia.org/wiki/Zen_Mind,_Beginner's_Mind


Wonderful writing. I get the same feeling about the ego thing. What makes me wonder is that there are people who can write about ego and coding in such great detail. I admire that. It truly resonates with me.

From what I see in other comments on HN, there are points and counterpoints. No article or idea works for everyone, as we are each a complex cocktail of ideas and impressions. If some idea resonates with you, you have found your type of idea, so enjoy it; otherwise, don't resist the idea, just wait for the next one that might work. Thank you for sharing, in any case.


This is really good. Not much else to say here, other than ‘thank you’.


Yes, quite good introspection; a really interesting article.


There is a predictable pattern to these types of posts and these types of life experiences.

You have a problem, you discover a potential solution and it seems to be working. You get excited and you try to make sense of it and you want to tell everyone because it is such a game changer! Then some time passes by, the emotions fade and you arrive at a new perspective - that your life hasn't changed that much or if it has, now you have a new set of challenges and a new potential solution will come your way sooner or later.

It never ends, unless you at one point recognize that it never ends and cease wanting to be better and wanting to understand it all so much. It's not that you purposely cease trying or wanting, it's that you relax the wanting, because you know the problems will be there tomorrow and the day after, no matter how much you try.

That's when 'it's the journey, not the destination' finally sinks in and life takes on a new quality :)

For some people, it happens when they are reminded of death and the inevitability of it all, for some when they've burned through their health enough that they can't do it anymore, for others it just occurs to them one day - I can't keep up with this bigger, better, faster, stronger culture and frankly, I don't want to, either.


I can relate to the parallels between programming and athletics. I like to consider myself an athlete, and if there is one thing I have learned about sports, it's that falling from the top to the middle of the pack can happen in a week, even though it can take years to get from the middle of the pack to the top.

So when the author was talking about how being an expert is really just a matter of becoming a great student, I was quickly reminded of my golf game, where I often find myself with the lowest handicap I have ever had without actually feeling like I am improving. I shave a stroke one day. Then another... and another, and before I know it I am a 3 instead of a 10.

That being said, I wouldn't say I have an ego problem in coding myself, because tbh I have always felt like a bit of an imposter. I think my imposter syndrome has actually ended up being a good thing over time in my career as a programmer. It seems to have kept me grounded and, as the author suggests is a good thing, it seems to have kept me in the forever-a-student mentality.


The author has overfitted for his own psychology. There are a lot of assumptions in there. My brain doesn't work like that at all.


Reluctantly agree, as I fit the author profile.


> Top students convincing everyone else to stop trying. Or, great engineers convincing the rest to stop trying.

This happens in a more literal sense as well. Since we compare ourselves to each other, it makes sense that more experienced engineers love feeling like they're ahead and beyond newer engineers, and it bleeds into behavior.

For example, engineers love asking candidates obscure questions in interviews - as if not knowing a specific JVM quirk makes the candidate less of an engineer.

Let's also not forget the severe elitist attitude some engineers have when interacting with others. It's almost like everyone else is trash. For all the work he's done, which I admire, Torvalds was seriously toxic to interact with.

It all ties back to us wanting to compare ourselves to each other.


> My ego creates a tight bond between my work and my identity. Linking my self worth to how well I do my job.

Same experience here, and probably for a lot of devs. It took a good 10 years or so to break out of this. If I couldn't get something working that I expected to work, I silently took it out on myself: I must not be good/smart enough. And so I'd beat my head against the wall. Once I started letting broken code or an unfixed bug wait until the next day, and saw that the sky didn't fall and that I could eventually fix the issue, I started to trust myself more and just let things be. Some days, everything works. Some days, nothing works. Your build fails and you spend hours updating some obscure library, and you don't get feature X done that day. It's really your decision whether or not you let this affect how you feel.


I think what people mostly get wrong is the part about judging "how well I do my job". They come up with insane expectations for themselves.


This is a great blog. I don't think it would have really made sense or resonated with me if I hadn't been in therapy for the last year or so, almost exclusively working on this. Which is what I really want to endorse if you're someone curious or in pain: there are ways to work on it.


Did anyone else find the "Relative Distance" section wrong-seeming? Not the Harvard-vs-random-school thing, but the conclusion they draw: that you perform relative to your standing. How are students aware of their positions? Do some schools post rankings or something? When I was in school, your standing/grades were private, and no one knew another student's standing (aside from the obvious, people failing out and such).

If I were guessing at a cause, I'd put it more on the teachers/school: they have it in their mind that there should be a range of performance among the students, and they enforce that idea. E.g. if everyone gets an A in their course, they start making it harder until they get the distribution they expect.


Many universities grade on a curve: X% get an A, Y% get a B, etc. If you are getting C’s in your classes, you can infer that you're somewhere in the middle 30%; A’s, and you're likely in the top 30%. You are correct in saying that you don't know who is the best and worst in your class, but all you need to know is where you stand for Relative Distance to work.


Seems wrong to me too, but for a different reason - I would expect a better school not only to attract smarter students, but also to be harder. So it should balance out to the same curve.


There is a perspective shift that comes (usually but not always) with age I think.

When I was younger I got into computer programming. For the first ten years (1987-1997) I thought I was hot shit because I could do things with computers that no-one else I knew could even understand (with the exception of a family friend who was a programmer in aerospace). Then I ran into other programmers on-line and realised that there were other, much better programmers in the domains I was interested in (strangely, I never got into programming games; I always liked utilities and 'productive' stuff).

So I doubled down and resolved to be the best programmer I 'knew' again, except this time I knew hundreds (and over the years, thousands) of programmers: an impossible treadmill.

Sometime in my late 20s/early 30s (so ~2007-2008) I realised that not only was I never going to be the best programmer I knew, I really didn't know much about programming in the general sense of the whole field (no-one does, really, except the odd person). So I re-framed it: I was going to be a better programmer than the me of a year before, and focus on the other skills I'd let languish over the years, what I'd often derided as 'soft' skills. (I don't think I was ever an arse-hole, but I was the guy who'd sit in the corner muttering with the headphones blasting thrash metal.)

In the end, what I realised was that after all this, I like programming, I like providing value, and when it comes to work the best thing I can get is feedback from a user whose life I've improved by making whatever I've touched that little bit better.

If I can do that then it was a good day.

The freedom from all this is that I learnt to play again: if I'm interested in functional programming I'll go poke at that for a bit; if I'm interested in algorithms I'll go poke around over there. Free from the self-imposed need to compete, I get to satisfy my own curiosity and nurture the devs on the team I run.

With 7 billion people on the planet it's statistically unlikely you are ever going to be the best and even if you are it's likely in only one dimension.

I noticed that the programmers I really admire are all older than me and seem to be excited/happy about technology, and I wondered how they kept that enthusiasm for so long in an industry where so many seem miserable. I think I can hazard a guess now.

Oh, and because the universe loves a punchline, I now have a dev on my team who is determined to prove himself the best programmer, never says a word, and listens to thrash metal all day while muttering. He's talented, so I'm curious to see how he figures it out.


I found this quote from the article quite motivating:

> “I cannot say this too strongly: Do not compare yourselves to others. Be true to who you are, and continue to learn with all your might.” ― Daisaku Ikeda, Discussions on Youth


A self-improvement piece once again. Why are people so obsessed with this? Imposter syndrome seems to be a thing I really don't get. It never really bothered me if I actually caused the bug when looking for the root cause - OK, maybe sometimes you worry that you messed up just before the release, but that's pretty much it. Maybe realizing that you don't need to be a great programmer to make a great piece of software helps.


Imposter syndrome for me is kept in check if I stay within my 'quota' of mistakes. It doesn't bother me if I am the author behind a bug (normally, every bug is an opportunity to understand the software we've built better), but if consecutive bugs come back to my work, then I notice it. And I notice it in others too, and then I want to see them putting effort into improving their skill, etc.


Quota... is that something like a percentage of bugs you feel responsible for? That again would be comparing yourself to others - not saying it's a bad thing unless it's bringing you down. I don't really know what I'm talking about here, as I have never suffered from that. Insecurity, yes; imposter syndrome, no.


I have my best coding sessions after I start the day with fifteen minutes to half an hour of mindfulness meditation. That calms my mind, makes me "less anxious", and lets me go in new directions without feeling the urge to walk away, procrastinate, or do something else.


Incredible post! I've been struggling with anxiety, depression, and a lack of motivation at work for the past year or so, and slowly coalescing towards the feeling that I needed to focus on my work.

This article feels like it gives me the how - thank you.


This was an amazing post!

100% spot on that the external distractors are easier to manage than the internal ones. A buzzing phone, tempting social media websites, and loud rooms all tend to be relatively easy problems to fix. As for internal distractors, I feel like telling a personal story after reading this... There are two internal distractors I've recently noticed myself struggling with:

1) A busy mind.

I often find my brain meandering on ideas or conversations completely unrelated to the work I'm trying to do. Daydreaming, imaginary arguments, and unnecessary tangents all tend to creep in (esp. in the afternoon, for some reason). I'm glad this post touched on Zen Buddhism and the beginner's mind. At risk of proselytizing, I have to say the best way I've found to manage a busy mind is through meditation. Consciously setting aside 10-15 minutes every day to practice letting go of thoughts has helped build a (tiny) mental muscle which I can sometimes use to bring my focus back to the things in front of me.

2) Alcohol.

This is a bit of an external distractor, but also an internal one. In college, I was able to stay up all night drinking and coding. No longer! I find it amazing how much less productive I am even after a single glass of beer. I now get tired shortly afterwards and have immense difficulty focusing. Perhaps, as the article mentions, the alcohol is wrapping my ego up in the task at hand. I don't have a drinking problem, but I now solve this by consciously deciding how to spend my next couple of hours: "Am I going to grab a drink and take an extended break (perhaps for the rest of the day)? Or am I going to grab a water/tea and continue working?" Gone are the days when I could reliably reach the Ballmer peak (https://xkcd.com/323/).


For alcohol, throttling the bottles worked for me: I drink on one particular day of the week instead of any random day, and I consciously say no to any urges/temptations in between. This has helped me control the habit.

For controlling distractions, other hobbies - music or something else apart from programming - keep me ticking.

For internal distractions, I agree with the post: separate the "me" from the problem I am trying to solve.


> Alcohol

I've found that I almost can't program anything sensible after even very small amounts of alcohol; even half a beer or 25ml of vodka is enough. It's either alcohol or programming for me. I don't drink too often, but when I do, even after large amounts I don't get a hangover, and I remember having memory loss only once.


One of my teammates said he thought he coded best with a slight buzz, but who knows - maybe he was just creating tech debt and vulnerabilities at scale.


For me, having one beer when coding helps because it removes my discomfort of sitting in a chair for very long. If I've been sitting for a while and coding, my mind is willing but my body is complaining about the discomfort. A bit of alcohol goes a long way to help that.


Interesting. I'm in my 50s, and I've found the Ballmer Peak to be a real thing for me ever since college. It's consistently been 2-3 beers. More than that brings a very steep dropoff in productivity, but I wonder if those 2-3 beers are de-egoing my programming. I suspect in my case they are.


I'd strongly tend to say "No" - alcohol is liquid ego, in my experience. Even a single beer will diminish awareness and consciousness.


> Daydreaming, imaginary arguments, and unnecessary tangents all tend to creep in (esp in the afternoon for some reason)

And then

> 2) Alcohol

Have you had your B12 levels checked? You might want to try a supplement.


Make sure you stay hydrated, alcohol can dehydrate you.


So, one part of my brain is an adult, and another part is a child acting from beneath consciousness.

Then most of these articles say the recipe is to have the adult very closely observe the child and hit it with a stick any time it wanders off the "desired" path.

No.

How do you expect it to work between a real parent and a real child? I think it would fail miserably and annoy both sides. I'd suggest coming up with some nasty parental tricks instead. Also adjust them after some time, because they do tend to stop working.


There is an interesting parallel drawn by a famous theologian: sin is the self curved in on itself.


>It’s clear to me now that it's not about what I know, but rather how I think that's different on these days.

yup


I think software ecosystems are currently infested by corporate interests. I feel kinda funny for saying this; it sounds like some kind of left-wing politics. But software has become increasingly consolidated. There are 5-10 companies in the US that drive most of its development, and they've even picked up the most popular open source projects. When you're part of an ecosystem, your objective is to work the system; making software has lower priority.

It's extremely hard to make money as an independent software developer. There's a lot of noise and money in the market, and it's hard to compete with marketing from big companies. You have to work for a corporation or a startup with funding. You have to be part of an ecosystem. When you are accepted to a program like YC, you win entry into an ecosystem.

Can you be a software artisan nowadays? Can a small team develop and sell software without an ecosystem behind it? I've seen some examples of this, like Ruby on Rails and 37signals, but they are rare exceptions.

I'm currently working on an open source project. Let's see how long I can be independent for. Check out my project :)

https://github.com/vidalab/vida


It might seem like a tangent, but I would recommend studying (self-)hypnosis.


Glad to see Jose writing again! Great post as always from them.



