I thought about that, but my list was focused on things that Google was leading in but let the market get away. Amazon got into cloud before Google did.
We are a small startup (< 20 employees) looking for a mid- to senior-level backend Go engineer to develop features on our backend server, which is 100% Go. We use Postgres as our primary data store. Prior experience with Go is preferred but not required. Experience with SQL is preferred.
Job is remote, company is based in USA and most employees are in North America.
This is for a position with Core Engineering, a new engineering team at Bread. This is a great opportunity to be one of the first engineers on a team dedicated to building high-impact services at an established company.
Tech Stack: Go + Postgres + Kubernetes + SOA
Knowledge of Go is preferred, but any strong background in backend engineering will do.
As a self-taught developer I can say your point assumes self-teaching is all about following your interests. This is simply not true. Self-teaching is a combination of following your interests and figuring out your areas of weakness and attacking them. Part of my self-learning involves browsing college degree requirements or graduate programs to identify must-know topics, and I learn them regardless of whether I have a particular interest in them. If college is the only thing forcing you to think outside the box, what hope is there for your long-term learning?
My point assumes people bias their learning towards their interests. That isn't an opinion, that's just human nature. That's not to say you can't learn things that don't interest you, but if you aren't going to do it formally you certainly have to be driven by filling those gaps and most people aren't.
Learning to think outside the box and learning about specific specialties that you don't find interesting aren't even remotely the same thing. If you read what I wrote, you'll notice the example areas I gave. None of them were "thinking outside of the box", they were fields that didn't interest me.
The value they bring is that I learned a common language. I can talk to ML engineers or compiler engineers in a common language because I have experience in those fields that I wouldn't have had I taken a different route. That isn't even remotely a requirement for my field. It's just useful for discussions with other people.
Maybe you'll spend hours and hours learning compiler design and implementation so you can have a friendly conversation with a colleague at lunch, but I'm not going to. Formal education forced that on me. That's been healthy for my professional networking, but that's about it. I still consider it useful, but had I skipped the CS degree I wouldn't do it. Nor would most people.
At the same time, people who are self-taught typically have more time and energy to attack things in more depth. My CS classes are pretty "standard," and in my free time I can go on long tangents where I really learn. I learned about assembly programming and the basics of how CPUs work from self-learning rather than getting a very simplified understanding from a class. There are many things in CS that cannot be learned in one semester, and people who are self-taught have the ability to take things at their own pace and jump to different topics, something that busy college students have little time to do.
No, I'm speaking about human nature. Most people don't learn things they dislike just because doing so would fill a knowledge gap. How many people do you know with hobbies they hate?
It's natural to gravitate towards things that interest you. Some people are driven by gaps and learn things regardless; most people do not. That's not even controversial.
Interesting, I've had the opposite experience while driving back to NYC from other east coast cities. Google generally takes into account that traffic will increase as I approach NYC, but the initial estimate is always off by at least 1-2 hours.