
Moving from software to digital logic design


This post is in response to a Hacker News commenter asking me for some thoughts on how to get into the chip industry.


Where I’m coming from

I am not personally a hardware design engineer. I’m a computer architect — focused specifically on novel processor design. I started out in web tech and gradually moved lower and lower towards hardware (over the course of ~14 years, starting from around age 16). My experience is as unique as the next person’s, and that shapes my perspective accordingly. Everything that follows is written from my vantage point.

From the context of the thread, I’m guessing you’re most interested in learning (and perhaps working in) Hardware Design - as opposed to any of the other things I listed. This is definitely a steeper learning curve than the other routes I’ve mentioned, as there’s less crossover of knowledge and skills. I even said in my original comment that the move from SW (any role) -> HW Verification is readily possible, while SW (any role) -> HW Design is possible but is a gradual process, usually learnt on the job while in an adjacent role.

To get into the industry, I’d recommend finding ‘design-adjacent’ roles (like verification) and using that as the opportunity to learn from colleagues. You can then progressively side-step your way into design.

Unlike software ecosystems, hardware design is ‘closed’. There is little open-source stuff, and most open-source designs are not serious commercial designs that you can learn from (most, not all, but most). Open-source tooling exists but forms only a small subset of the tools from Synopsys, Cadence and Siemens. There are learning materials, lecture courses, etc. on YouTube and various university websites (and, for example, Bristol University’s entire Design Verification course is available freely) - but they’ll only take you so far. At some point, you have to be inside the walled garden (i.e. working in the industry) to be able to learn more.


The landscape

Unlike software engineers, who can move pretty fluidly from one specialism to another, the semiconductor industry has a lot of truly specialist roles. The skillsets are deep, specific and large enough to each be an entire field of study.

Do people move between specialisms? Yes. Is it easy? No.

I’m focused on novel computer processor design. This is a niche of digital chip design. And digital chip design is a niche of chip design, which is a niche of … you get the idea. I’m not going to be able to offer advice on how to get into analogue chip design or photonics or a zillion other possible areas of the semiconductor industry. But here’s a rough map of how the main disciplines relate:

                        DIGITAL CHIP DEVELOPMENT
                   (according to AI: Claude Opus 4.6)
 
    ┌──────────────┐     ┌─────────────────┐     ┌───────────────────┐
    │              │     │                 │     │                   │
    │ Architecture │────▶│  Design (RTL)   │────▶│  Physical Design  │
    │              │     │  ★ THIS POST ★  │     │                   │
    │              │     │                 │     │                   │
    └──────┬───────┘     └───┬─────────┬───┘     └─────────┬─────────┘
           │                 │         │                   │
           │                 │         │                   │
    ┌──────┴──────┐       ┌──┴─────────┴───┐      ┌────────┴───────┐
    │             │       │                │      │                │
    │  Modelling  │       │  Verification  │      │  Validation &  │
    │             │       │                │      │  Bring-up      │
    └─────────────┘       └────────────────┘      └────────────────┘

                   ┌──────────────────────────────┐
                   │       SoC Integration        │
                   │    (spans the entire flow)   │
                   └──────────────────────────────┘

Here’s a brief summary of each discipline, with the key skills and typical job titles you’ll see in job listings.

Architecture

Defining what a processor does and how it does it: the instruction set, the pipeline, caches, memory hierarchy, etc. This is where I spend most of my time.

  • Key skills: Computer architecture fundamentals (ISAs, pipelines, caches), performance analysis and benchmarking, software/hardware co-design trade-offs, specification writing, familiarity with compilers and operating systems
  • Common specialisms: Processor Architect (CPU, GPU, AI/TPU/NPU, DPU, etc.), Microarchitect (crosses over to Design role - see below), Computer Architect (higher level), SoC Architect (in between)

Design (RTL) ★

Writing the actual hardware in Verilog or SystemVerilog. Turning an architectural specification into synthesisable logic. This is what the rest of this post focuses on.

  • Key skills: Verilog / SystemVerilog (RTL coding), digital logic fundamentals (timing, clocking, resets), synthesis-aware coding, FPGA prototyping, scripting (TCL, Python)
  • Common specialisms: RTL Design Engineer for Silicon or for FPGA, across SoCs, IP, NOCs, etc. etc.
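
To make ‘synthesisable logic’ a bit more concrete, here’s a minimal sketch of the kind of RTL this discipline deals in: a registered 8-bit counter with a synchronous reset and an enable. The module and signal names are my own, purely for illustration, and a real design would bring far more rigour to resets, clocking and verification.

    // Minimal illustrative example: a registered 8-bit counter.
    // Synthesis-aware points: a single clock, a synchronous reset,
    // and all state updated in one clocked always_ff block.
    module counter8 (
        input  logic       clk,
        input  logic       rst,     // synchronous, active-high
        input  logic       en,
        output logic [7:0] count
    );

        always_ff @(posedge clk) begin
            if (rst)
                count <= 8'd0;
            else if (en)
                count <= count + 8'd1;
        end

    endmodule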

Verification

Checking the design does what it’s supposed to. Arguably the largest discipline by headcount, and the most accessible entry point for software engineers.

  • Key skills: SystemVerilog / UVM methodology, constrained random verification, formal verification, coverage-driven approaches, scripting (Python, TCL)
  • Common specialisms: Design Verification Engineer, Functional Verification Engineer, Formal Verification Engineer, Design-For-Test (DFT), etc.
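
For a flavour of what ‘constrained random’ means in practice, here’s a hypothetical SystemVerilog fragment (bare language features, not UVM): a transaction class with random fields and a constraint, randomised a few times in a testbench. All names are invented for illustration.

    // Hypothetical constrained-random stimulus for a bus transaction.
    class bus_txn;
        rand logic [31:0] addr;
        rand logic [31:0] data;
        rand bit          is_write;

        // Keep addresses word-aligned and within a 4 KiB window.
        constraint c_addr { addr[1:0] == 2'b00; addr < 32'h0000_1000; }
    endclass

    module tb;
        initial begin
            bus_txn txn = new();
            repeat (10) begin
                if (!txn.randomize()) $fatal(1, "randomisation failed");
                $display("txn: addr=%h data=%h write=%0d",
                         txn.addr, txn.data, txn.is_write);
            end
        end
    endmodule

UVM layers a lot of methodology on top of this, but the underlying idea is the same: describe the legal stimulus space and let the tool explore it, while coverage tells you what has actually been exercised.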

Physical Design

Taking the synthesised logic and placing it on silicon: synthesis, floorplanning, place-and-route, timing closure, getting to a GDSII file that can be manufactured.

  • Key skills: Place and route tools (Cadence Innovus, Synopsys ICC2), static timing analysis, clock tree synthesis, power analysis, floorplanning and constraints
  • Common specialisms: Physical Design Engineer, Backend Design Engineer, Timing Engineer, STA Engineer

Modelling

Building fast, abstract models (typically in C++ or SystemC) to explore architectural trade-offs, assist verification and predict performance before RTL exists. This is where C++ SoC modelling skills are immediately valuable.

  • Key skills: C++ / SystemC, statistical analysis and benchmarking, computer architecture knowledge, trace-driven and execution-driven simulation, performance analysis methodology
  • Common specialisms: Functional Modelling, Performance Modelling, Performance Analysis, Abstract Modelling, Benchmark Writing, Timing-accurate Modelling

Validation & Bring-up

Testing real silicon once it comes back from the fab. Lab work with oscilloscopes and logic analysers, getting the first software to boot on a new chip.

  • Key skills: Lab equipment (oscilloscopes, logic analysers, protocol analysers), embedded software / firmware, debug methodologies, board-level understanding, JTAG / trace / debug infrastructure
  • Common specialisms: Silicon Validation Engineer, Bring-up Engineer, Post-Silicon Validation Engineer, Lab Engineer

SoC Integration

Assembling IP blocks into a complete system-on-chip: connecting processors, memory controllers, peripherals, and interconnects together.

  • Key skills: IP integration and bus protocols (AMBA/AXI, TileLink), address map and memory map design, system-level verification, scripting and automation, interconnect design
  • Common specialisms: SoC Integration Engineer, SoC Design Engineer, IP Integration Engineer, System Design Engineer
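
As a tiny illustration of the ‘address map’ side of this work, here’s a hypothetical top-level decode for a made-up memory map; names, regions and sizes are all invented, and real SoC flows often generate this kind of logic from a machine-readable description of the memory map rather than writing it by hand.

    // Hypothetical address decode for a made-up SoC memory map.
    module addr_decode (
        input  logic [31:0] addr,
        output logic        sel_rom,   // 64 KiB at 0x0000_0000
        output logic        sel_sram,  // 128 KiB at 0x1000_0000
        output logic        sel_uart   // 4 KiB at 0x2000_0000
    );
        assign sel_rom  = (addr[31:16] == 16'h0000);
        assign sel_sram = (addr[31:17] == 15'h0800);
        assign sel_uart = (addr[31:12] == 20'h20000);
    endmodule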

Accessible roles for software engineers

The roles I mentioned in my earlier comment — verification and modelling — are the ones I think are most accessible to software engineers trying to enter the industry. But if it’s Design you’re after, read on.

Note on AI-assisted writing

This section is the only AI-generated section of this blog post. I think this landscape information is useful but I didn’t think it needed my full attention to write it.


Learning digital logic design

To take a learning-first approach, you need to start from the fundamentals. It’s no good diving straight into FPGAs, SoC design and Computer Architecture/Processor Design if you don’t understand the driving forces behind design decisions.

There are two main driving forces for digital logic design (that I can think of at the moment, anyway):

  1. The needs of software, and
  2. the constraints of physical implementation.

You’ll know the former if you’ve dived into low-level software like kernels, compilers, language runtimes, embedded software, firmware, etc. You won’t know the physical constraints without learning the building blocks of digital logic design.

Some textbooks I recommend include:

  • Digital Design: Principles and Practices (John F. Wakerly) — never mind that it’s a textbook, I actually enjoyed just reading this cover to cover. It is essential reading in my view.
  • Microelectronic Circuits (Sedra and Smith) — Useful to read its key chapters. Not a book I have read cover-to-cover, but one I dipped into where lecturers recommended it.

Reading the history of processor design (books, Wikipedia, various blogs, technical press articles, etc.) also gives a lot of insight into the constraints of digital logic and why the industry has progressed to its current designs.

Once you’ve got the principles down, then doing some simple projects is the next step. Implement a simple UART controller. Implement a communication controller on top of that. Now implement a state machine, maybe a traffic light controller or a peak value detector or a discrete edge detector. Hook it up to an input signal, output ‘events/results’ over the UART. Control its operation with responses over the UART. If you’re feeling ambitious, try to drive a VGA connection to an old monitor.
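
To give a feel for the scale of these starter projects, here’s a minimal sketch of the edge detector mentioned above: it emits a one-cycle pulse whenever its (already synchronised) input goes from 0 to 1. The names are my own, and a real design would also need to synchronise the raw input into the clock domain.

    // Minimal rising-edge detector: pulses event_o for one cycle
    // whenever the (already synchronised) input sig_i goes 0 -> 1.
    module edge_detect (
        input  logic clk,
        input  logic rst,      // synchronous, active-high
        input  logic sig_i,
        output logic event_o
    );
        logic sig_q;           // previous value of sig_i

        always_ff @(posedge clk) begin
            if (rst) begin
                sig_q   <= 1'b0;
                event_o <= 1'b0;
            end else begin
                sig_q   <= sig_i;
                event_o <= sig_i & ~sig_q;
            end
        end
    endmodule

Feeding event_o into a UART transmitter gives you exactly the ‘events/results over the UART’ setup described above.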

A note on FPGAs: for the work I’ve done, FPGAs are for prototyping, not for the end-product. These are two equally valid but very different uses of FPGAs, and at a later stage you need to know which you’re trying to learn.

Once you’ve got the hang of these simple designs in FPGAs, you’re ready to have a go at modifying an open source processor design. I started with PicoRV - “pico risk five” - but there are plenty of good options to choose from at this level. Trying to add an instruction is arguably the most instructive thing you can do (pun intended).
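
To give a feel for what ‘adding an instruction’ involves, without reproducing any particular core’s internals, the decode end of the change tends to look something like this hypothetical fragment, which recognises a new R-type instruction in the RISC-V custom-0 opcode space. The module and signal names are invented; every core structures its decode differently.

    // Hypothetical decode for a new R-type instruction in the RISC-V
    // custom-0 opcode space. Not taken from any real core.
    module custom_decode (
        input  logic [31:0] instr,
        output logic        is_my_instr
    );
        localparam logic [6:0] OPCODE_CUSTOM0 = 7'b0001011;

        assign is_my_instr = (instr[6:0]   == OPCODE_CUSTOM0)
                          && (instr[14:12] == 3'b000)        // funct3
                          && (instr[31:25] == 7'b0000000);   // funct7
    endmodule

The decode is the easy part; the learning comes from threading the new instruction through the rest of the pipeline: reading its operands, adding (or extending) an execute unit, and making sure the result lands in the writeback path.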

By this point, you’ll have built up a good sense of what you can and can’t do in digital logic design. You’ll have picked up a real processor, one capable of running real software and backed by a real ISA manual, and modified it. And at this point, you basically hit a wall. To my knowledge, this is pretty much the limit of what you can achieve on your own. To move on from here, you have to start investing in increasingly expensive equipment: just being able to run tools like Vivado on larger designs requires a high-performance machine of some form (a cutting-edge Apple M-series processor, or a top-end AMD/Intel system with plenty of RAM). This is really best achieved by having a company pay for the equipment and pay you to do the work.

So the next step is to go to a company and take a ‘graduate’ (i.e. entry-level) design role. That’s a serious pay cut for an experienced engineer to stomach. Hence my original recommendation of doing design-adjacent roles where existing skills will be valued and the pay cut will be much less (if any).


Will AI help?

The questions I was asked included:

How the current AI tools make it easy (if at all)?

To which I say: yes, AI tools can help you.

Just accept that they’re a tool, not a solution in and of themselves.

Also, you can’t learn chip design from AI tools. Sorry, but the quantity of training material is too low for them to be useful as a direct teaching tool. AI tools are great at web tech precisely because there is a vast corpus of production-quality (and worse) code for them to learn from, available openly. For all the reasons already mentioned, this is not true in hardware. This doesn’t mean AI tools are useless in hardware, it just means they’re currently (at the time of writing) much more severely limited.

So, what is the best use of AI tools in my opinion? Asking questions, particularly when encountering “how do I do X” or “what does this error message even mean” or “why won’t X do Y” situations. Hardware tooling is abysmal. The entire chip design industry knows this. If you’re trying to learn today, Google and Stack Overflow are of limited (and declining) help, and vendor documentation is often so poor that it can actively get in the way of achieving your goals. This is where AI shines. This is where AI can take your fuzzy understanding of an issue or intention, ingest unintelligible error messages, and piece it all together with the material it’s been trained on (which includes supposedly-NDA’d EDA vendors’ tool documentation!) to get you to an answer.

If you ask AI today to write Verilog or SystemVerilog, the results are highly variable. Sometimes you’ll get good stuff. Often you’ll get graduate-level code that might work on FPGA but won’t be good enough to take you to silicon. And often enough to be an issue: you’ll get complete garbage. Still, AI can be useful, particularly for (shivers) TCL code.

If you’re trying to learn, don’t get the AI to write everything for you. You won’t have learnt hardware design, you’ll have learnt “instructing someone/something else to do hardware design”, which defeats the premise of this entire blog post and discussion.

The original SemiEngineering article spent most of its time talking about AI (rather than answering the question posed in its title). Frankly, I think the ideas presented contain a mix of real possibility and outright fantasy. It would take a whole extra article to pick them apart, which is why I largely ignored the article in my comments on HN.

Anyway, in my view, AI today isn’t ready for full-fat use by engineers entering chip design for the first time (which is the category of people I’m addressing in this blog post). It’s ready to be “an assist” for answering questions that Google has decided it doesn’t want to be good at answering anymore.


Is it worth it?

This is a very subjective question. My HN comments contain some data and some of my thoughts on whether moving from software engineering into chip design is worthwhile, from the perspectives of compensation, enjoyment, etc. Maybe at some point I’ll write a follow-up post on this topic if someone asks again.

My parting thought is this: It’s worth it if you enjoy it. This isn’t a career move you make just for compensation or ego.