Routes into Computing / Computer Science / Software Engineering

Jan 17, 2022


This short article explains what routes are available to those who want to work in computer-related fields, whether that be game development, general software development, or anything else.

First of all, there are three main ways into computing jobs:

  • University / College degrees. The best option for pretty much everyone.
  • Bootcamp courses: Can be a good choice in rare cases.
  • Self-taught: Can be a good choice in rare cases.

I will explain each one in turn.

Table of Contents

· University / College degrees
· Bootcamp courses
· Self-taught
· Working in the software industry
· Small companies and startups
· Large companies
· Crunch
· Gaming industry conditions
· Developing software on your own

University / College degrees

University/College degrees take 3 years (for Bachelor’s degrees) or 4 years (for Master’s degrees), and of course come with costs in a lot of countries.

As a rigorous science subject, most universities/colleges offer good courses in this field. There are various degree types, but by far the most common (and most useful) are Computing / Computer Science degrees, which set you up for all kinds of jobs in the field by teaching a comprehensive understanding of the subject that is difficult to acquire any other way. There’s nothing wrong with more specialised degrees like Games Programming, but be aware that specialising will narrow your other career choices upon graduation somewhat.

Most computing degrees have a lot of options within them anyway — those wishing to go into games can choose optional modules in their second/third/fourth years that complement that, or whichever other career they think they would like to go into.

Bootcamp courses

Bootcamp courses are usually 6–12 months long, and mostly exist to train people for one particular employer. They are shorter than degrees because you learn only what is relevant to that employer’s technology and nothing else, and they are generally light on fundamentals, which in this field matter a lot.

Bootcamp courses, by and large, don’t teach you much computing at all. They teach you the bare minimum to do one specific computing job. Three to four years of material wasn’t compressed into 6–12 months by magic: many of the transferable skills, and anything not directly relevant to that one employer, are cut out.

This doesn’t make them completely unviable: it’s plausible that you could finish the bootcamp, love the employer, and live happily ever after. Be aware, however, that bootcamp-learned skills are not very transferable. They will be far less useful to another employer who doesn’t use the same systems, who requires knowledge of different languages and proper fundamentals, and who is unlikely to recognise your bootcamp qualification as useful to them.


Self-taught

This is theoretically possible, but very difficult. While there are excellent free resources on the web that will teach you many aspects of computer science, most cannot really teach you how to apply them effectively. It’s easy to learn how to code, but hard to learn how to code well and design your code properly without underlying knowledge of the subject as a whole (and half the challenge is knowing where to find that knowledge, something university helps a ton with).
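To make the “coding vs coding well” point concrete, here is a small illustrative sketch (my own example, not from any particular course): two functions that both deduplicate a list while preserving order, and both “work”, but scale very differently. Spotting this kind of difference is exactly the sort of thing fundamentals make second nature.

```python
def dedupe_naive(items):
    # Works, but is O(n^2): "x not in result" rescans the
    # growing result list on every iteration.
    result = []
    for x in items:
        if x not in result:
            result.append(x)
    return result


def dedupe_fast(items):
    # Same behaviour, but O(n): a set gives (amortised)
    # constant-time membership checks.
    seen = set()
    result = []
    for x in items:
        if x not in seen:
            seen.add(x)
            result.append(x)
    return result


print(dedupe_naive([3, 1, 3, 2, 1]))  # [3, 1, 2]
print(dedupe_fast([3, 1, 3, 2, 1]))   # [3, 1, 2]
```

On a five-element list the difference is invisible; on a million-element list the naive version grinds to a halt. Knowing *why* (and reaching for the right data structure without thinking) is what a grounding in algorithms and complexity buys you.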

On top of that, computer science at many universities covers topics such as logic (propositional, predicate, temporal, and modal), algorithms, compilers, and more, which are very useful for *understanding* how to think and code well, and which are not the sort of thing people think to learn on their own. Books on these subjects aren’t always easy to learn from unaided, whereas university lecture courses exist specifically to teach them.
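As a tiny taste of the propositional logic mentioned above, here is an illustrative sketch (names mine) that brute-forces a truth table to check whether a formula is a tautology, i.e. true under every assignment of True/False to its variables:

```python
from itertools import product


def is_tautology(formula, num_vars):
    # A formula is a tautology if it evaluates to True for
    # every combination of truth values of its variables.
    return all(
        formula(*values)
        for values in product([False, True], repeat=num_vars)
    )


# De Morgan's law: not (p and q)  is equivalent to  (not p) or (not q)
de_morgan = lambda p, q: (not (p and q)) == ((not p) or (not q))

print(is_tautology(de_morgan, 2))            # True
print(is_tautology(lambda p, q: p or q, 2))  # False (fails when both are False)
```

This is the kind of exercise a first-year logic course walks you through, and it quietly builds the habit of reasoning about all cases, which pays off constantly when writing and debugging real code.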

Realistically, it’s like self-teaching most subjects: since you don’t know the subject, you also don’t know which parts of it are important, what order to learn them in, or where to place emphasis. That doesn’t make it impossible, but it’s difficult and often leaves big gaps in knowledge.

Working in the software industry

Like most industries, software has companies both big and small, and the differences in how they work can appeal (or be horrifying) to different developers.

The primary pros and cons are discussed below. A fair few of the below points are applicable to most industries.

Small companies and startups


Pros:

  • Usually use more modern technology standards
  • Can be more open to flexible working and other “unorthodox” working methods
  • Smaller teams can make for a more social workplace in some cases


Cons:

  • More likely to lean on crunch than large corporations (outside of certain industries like video games, where crunch is everywhere). Small companies are more likely to pressure developers into working overtime for free “out of passion for the company and our vision” or “because we’ll fail without you” or other such nonsense.
  • Less likely to accommodate specific needs in the workplace due to lower funds and office space
  • More prone to instability due to growth, and a bad month or year hits harder

Generally speaking, small companies and startups are keener on ‘modern’ processes (the latest software standards, hardware configurations, and so forth), since they have the freedom to choose them without disrupting existing setups. With no large established software or hardware estate to maintain, they are less wary of change and less bound by set-in-stone coding rules, but this can be both a blessing and a curse.

On the one hand, this usually means you won’t be spending your days using a ghastly 25-year-old system that nobody really knows the internals of anymore and that seems to fail whenever it feels like it. On the other hand, in a company without well-established standards, the constant changing and shifting can be extremely irritating and detrimental to actually getting work done, especially if you end up working for a self-styled “innovator” or “entrepreneur”, as is common in many startups (English translation: person who forces unpaid overtime and unnecessary processes on others, then claims the credit for their work).

Large companies


Pros:

  • Often superior job security and salary
  • Often better facilities
  • No constantly changing standards… meaning you do at least know what you’re working with rather than things changing every five minutes


Cons:

  • Unless you are in a very skilled or specialised position, you are easily replaceable
  • No constantly changing standards… but also a nightmare to try and get good standards adopted
  • Frequently have very old legacy codebases that can be a pain to work with, so don’t be surprised if someone asks you to code in FORTRAN or Tcl (I may have had to work with one of those…)



Crunch

Crunch is a practice most often seen in the video games industry: in the period before a game’s release, teams and developers are pushed beyond their limits in working hours (frequently heavy unpaid overtime) in order to get the game finished.

Crunch is largely caused by unrealistic financial and scheduling expectations from company boards, leading to projects that don’t have enough time to be completed. Many companies ‘solve’ this problem by pushing devs harder instead of accepting responsibility for badly planned or executed projects.

Gaming industry conditions

If you want to go into computing in order to make games, be aware that the gaming industry is notorious for some of the worst pay and working conditions in computing.

This is for a few reasons:

  • Games companies can lean more on their developers’ ‘passion for the game’ to pay less and make them not quit mid-project, which isn’t really true for more mundane software that won’t get mass appreciation by the public and isn’t usually pursued with quite the same passion.
  • Game release dates are made public and tied to factors such as console launches and competitors, which puts immense pressure on companies not to move them. As a result, crunch is unquestionably a bigger problem in gaming than in any other software industry.
  • Unlike a lot of modern software that is continually released and updated, games need to go from nothing to fully functional at launch and are more often closer to ‘one-and-done’ projects with hardware and performance constraints to work within, which when combined with commercial pressure often leads to unrealistic project timeframes and inevitably crunch.

Developing software on your own

My main view on this: don’t try it unless you are absolutely sure of what you’re doing, and even then, 99% of the time, don’t try it unless what you’re building is very simple.

Most startups in the coding world don’t succeed by having brains or smart ideas, but by pushing developers beyond their limits, paying badly, and breaking laws. If that’s what you want to do, then do it, but nobody will care if you fail and I for one enjoy seeing any such company fail miserably.

The vast majority of startups fail, and there are two giant, unmistakable reasons for that:

  • Bad luck. You can’t control this one: insufficient market information, bad timing, and so on (no matter how much research you do, you can never have a complete picture, or even close to it).
  • Incompetent project management and overconfidence. You can control this one.

Being a developer does *NOT* make you a good project manager, a fact many indie games developers and startup ‘entrepreneurs’ miss entirely. Just about every crowdfunding platform ever is littered with failed indie games and other software startups that failed to recognise this.

Anyone remotely competent in developing software should recognise that, by and large, developing a game (even a very small one) is extremely difficult to do alone, or even with a small team, unless everyone has prior industry experience and there is a project manager. Even then it is a mammoth undertaking that will hit many unexpected hurdles; most crowdfunded games drastically underestimate just how many problems they will run into, not only in development but in team coordination, compatibility, graphics, and other areas.

As a result, the financial and time requirements for a game are often drastically underestimated, leading to either a terrible half-finished hackjob of a game or an abandoned game that never gets finished. The lesson here is to know your limits: don’t try to do something you can’t do. Game making requires many skills besides software, and even if you have all of said skills, there is almost zero chance you will do it all on your own. Doing it with a team requires solid project management, an acceptance that it will not go anywhere near as smoothly as you might expect, and a willingness to only get funding based on that assumption.

Software projects outside of games aren’t much different, but they require different skillsets. Simple applications can be built by a lone developer, but it remains vital not to overestimate your skills: UI/UX and good frontend design are not skills developers automatically have (and many do not). Likewise, when building e.g. a web application at scale, network security and administration are best left to specialists in those fields unless you want highly damaging and embarrassing security breaches. Accepting this, and recognising you will almost certainly need others on the project for it to work, is vital.

None of this is to discourage people from pursuing their own projects: on the contrary, it’s to encourage you to approach it in the right way, because the alternative comes with severe consequences.