I watched a YouTube video this morning about a 13-year-old boy who taught himself to make iPhone apps and got famous for it. He took an internship at Facebook and then started working there full-time. TV stations and news websites interviewed him and wrote about how he’s helping his family financially and how any teenager can start making tons of money if they just learn to code. And the story was nice and inspiring and stuff, except there are tons of kids who do the same thing and nobody writes articles about any of them. He’s probably 18 or 19 now[1] and still working at Facebook as a product manager. How’s he feeling now? On the other hand, I’m a college senior, dreading the day when I have to start working like a grown-up, wondering if I’ll miss college, and confused about why people can’t just stay in college forever. He never went to college. He’d probably have gotten accepted at lots of different schools (did he even get a chance to apply?), but he decided college wasn’t worth passing up the opportunity to work at Facebook and pull his family out of their crappy financial situation. Cheers to him.
I felt exactly the same way in high school. But I didn’t have a compelling reason to start working, or the balls to deviate from the Good Kid Story™. I started making websites when I was 10, and by the time I finished high school, I could churn out CRUD web applications like any rank-and-file software developer. Part of me honestly thought I could skip a few semesters of class once I got to Berkeley, because I already knew about for-loops and could write Hello World in a handful of languages. I thought college was going to be the place where people learn the less-useful, theoretical parts of programming. They’d teach me what a tree was, even though I’d never had any reason to use anything but PHP’s ubiquitous ordered hash map. I figured none of it would be anything I wouldn’t have learned anyway, if I just kept writing more and more code. And I was partially right, but also very wrong.
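To make that contrast concrete, here’s a minimal sketch (in Python rather than PHP, with a dict standing in for PHP’s ordered hash map, and a hypothetical `BinaryNode` class of my own invention for the classroom tree):

```python
# The "do-everything" ordered map: what PHP arrays gave me for free.
phonebook = {"bob": "555-0199", "alice": "555-0100"}
assert phonebook["alice"] == "555-0100"

# The classroom alternative: a hand-rolled binary search tree.
# Keys smaller than a node's key go left; everything else goes right.
class BinaryNode:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.left = self.right = None

    def insert(self, key, value):
        if key < self.key:
            if self.left:
                self.left.insert(key, value)
            else:
                self.left = BinaryNode(key, value)
        else:
            if self.right:
                self.right.insert(key, value)
            else:
                self.right = BinaryNode(key, value)

    def search(self, key):
        if key == self.key:
            return self.value
        child = self.left if key < self.key else self.right
        return child.search(key) if child else None

root = BinaryNode("bob", "555-0199")
root.insert("alice", "555-0100")
assert root.search("alice") == "555-0100"
assert root.search("carol") is None
```

For everything I was building back then, the first two lines were all I needed, which is exactly why the second half never came up.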
Getting a proper CS education is really important, and I wouldn’t recommend that anybody drop out of or skip college just so they can start working, especially if there isn’t a strong financial reason to do so. However, there are two hard truths that people don’t like admitting about CS education: 1) most of the material taught to undergrads is also available on the Internet, and 2) most people who get a CS degree are still cruddy programmers. So school isn’t irreplaceable, and attending school won’t magically transform you into a mature, grown-up programmer. But that’s really not why getting a formal CS education is important.
After 7 semesters, it’s still hard to say exactly why people place a lot of value on getting a formal education in computer science. Most people need to be taught programming, because they have no experience and are in no shape to do anything productive with a computer. But for all the programming prodigies of the world, there needs to be another reason. I can say that I’m a much better programmer than I was four years ago. It always seems like the code I wrote the previous year is a pile of garbage[2].
School forced me to learn things that I never would have learned on my own (because they were irrelevant to my own projects) and never would have learned working full-time (because they’d be irrelevant to the work I’d be doing). In high school, I had no idea people could write programs that did more than load and save data to a database. The classes I took genuinely expanded my sense of what programs were possible to write[3].
When I taught myself things as a kid, I would enter a tight loop of learn-do-learn-do. Most of the code I wrote was an attempt to get the Thing working as quickly as possible, which ended up causing a lot of frustration and wasted time. It’s hard to piece together a system before you understand its fundamental concepts. That sounds really obvious, but a lot of programming tutorials take exactly that approach: they tell you how to do the Thing, but don’t bother giving you any intuition about the method itself. College classes, on the other hand, have the freedom to explain the Thing in the abstract. Then, once you start doing it yourself, you know exactly what to look for[4].
It’s really unfair to make teenagers make these decisions about work and college on their own, because you shouldn’t be punished for making stupid life choices as a kid. Teaching myself programming as a kid was useful, but frankly, I was a terrible teacher. I’ve gotten better at that as well. This is my 5th semester as a teaching assistant, and I’ve picked up all kinds of awesome skills, from public speaking to technical writing, not to mention actual pedagogy. I’ve also spent literally a thousand hours working on my tooling, because college convinced me that it really does matter[5].
They say that it takes 10 years to really master a skill. Well, this is going to be my 12th year as a computer programmer, and I still don’t feel like I’ve mastered anything. I guess everybody learns differently, but it really sucks that society has convinced teenagers that college is optional or outdated. It’s easy to lure teenagers away from education with money and praise, especially because it’s really hard to see the point of a formal education when your entire programming career has been creating applications that are essentially pretty interfaces to a database[6]. It doesn’t help that college-educated programmers are sometimes embarrassed to admit that school doesn’t work for everyone.
I wonder if that iPhone kid is disappointed with the reality of working full-time in software development. The free food and absurd office perks lose their novelty quickly.
[1] I have no idea, actually.
[2] Some people say that’s a good thing? I’ve realized that code is the enemy. The more code you write, the more bugs you’ve introduced. It’s incredibly hard to write code that you won’t just want to throw out next year. Code is the source of complexity and security problems, so the goal of software engineers is to produce less code, not more. When you have a codebase with a lot of parts, it’s easy to break things if you’re not careful. Bad code is unintuitive. Good code should be resistant to bugs, even when bad programmers need to modify it.
[3] Little kids always tell you that programmers need to be good at math, which actually doesn’t make that much sense when I think about it. You need some linear algebra and calculus for computer graphics and machine learning. Maybe you’ll need to know modular arithmetic and number systems. But math really isn’t very important.
[4] A huge number of software bugs are caused by the programmer misunderstanding the fundamentals of the thing they’re interacting with.
[5] My favorite programming tools in high school were Adobe Dreamweaver and Notepad. I started using Ubuntu full-time in 11th grade, but didn’t make any actual efforts to improve my tools until college.
[6] Not to underestimate the usefulness of simple CRUD apps.