Designing a Digital Musical Synthesizer on the Zedboard

Yuhei Horibe
4 min read · Nov 10, 2020

In this section, both the hardware and software implementation of the Zedboard digital musical synthesizer will be explained. Each chapter (“Hardware Design” and “Software Design”) has links to a design overview and to the details of each component design.

This article also explains which components already exist and which are newly added (the scope of this personal project).

Hardware Design

Figure 1 shows the block diagram of the entire synthesizer hardware. Details of the synthesizer hardware design are explained here.

Digital Synthesizer Hardware Design

Figure 1. Digital Synthesizer Hardware Architecture

For this project, I chose the Digilent Zedboard as the target board. Details of the Zedboard are available below.

Digilent Zedboard

The Zedboard is an SoC (System on Chip) evaluation board. More specifically, it carries a Xilinx Zynq-7020 SoC, which has two key components:

  • PS (Processing System)
  • PL (Programmable Logic)

The PS contains CPU cores capable of running a Linux OS. The goal of this project is to play MIDI (Musical Instrument Digital Interface) files with the implemented synthesizer. This is one of the reasons I chose this board, since I was planning to use existing libraries to handle MIDI events and audio control.

The PL is essentially an FPGA (Field Programmable Gate Array) with interfaces to the PS and to external I/Os. An FPGA is reconfigurable hardware: we can create our own hardware design in an HDL (Hardware Description Language), and the design can be synthesized, implemented, and programmed onto the device with Xilinx’s Vivado IDE. An overview of HDL and basic digital hardware design is given in previous sections.

Digital Hardware Designing

What I designed for this personal project is the hardware in the PL (inside the blue box in Figure 1). This is the main scope of the project’s hardware design part.

Output of the designed component

The audio CODEC LSI, an Analog Devices ADAU1761, is mounted on the Zedboard, which also has physical analog input/output connectors. The audio output signal is generated in the PL and goes to the ADAU1761 via the I2S interface. Volume control, mixer control, and so on are handled by the ALSA audio driver, and those control signals are sent via the I2C interface.

Input of the designed component

A MIDI file is opened on Linux (running on the PS), and MIDI control signals are sent from the PS via an MMIO (Memory Mapped I/O) interface (AXI4-Lite). MIDI control does not need much bandwidth, so to keep the design simple, this device is implemented as a character device.

Main components in the PL

  • Synthesizer block (original module, entirely written in Verilog)
  • AXI interconnect (Xilinx IP)
  • I2C interface (Xilinx IP)

These three components are connected together in the top-level design. The synthesizer block is implemented as a custom IP. Details of this synthesizer IP design are explained below.

Digital Synthesizer Hardware Design

Software Design

Figure 2. Digital Synthesizer Software Architecture

As mentioned in the previous section, I will use existing libraries to handle:

  • MIDI events
  • Audio controls, such as volume, mute, mixer control, and so on

Both are implemented in the ALSA (Advanced Linux Sound Architecture) library. To utilize this existing code, the software components below must be implemented in the Linux kernel:

  • Audio driver
  • Kernel (MIDI) sequencer client driver

Both are instantiated in the sound card driver. The audio driver part is the leftmost vertical line in Figure 2, mostly coloured in blue. The MIDI driver part is the centre vertical line in Figure 2, mostly coloured in green.

Usually, a sound card on a PC is connected via the PCIe bus, but on SoCs the audio CODEC LSI is directly connected in most cases. So the driver category differs from that of a usual sound card driver (this becomes important later). The driver for this hardware will be implemented as an ASoC (ALSA System on Chip) module.

Lastly, we need to control the “tone” of the instrument, which is set by several parameters of the synthesizer. For this part, I used a UIO (Userspace I/O) device driver. It exposes the device’s address map directly to user programs, so a user program can access the memory-mapped I/O of this specific device directly. The core of the device driver and the application can therefore be implemented as a user program. This is shown as the rightmost vertical line in Figure 2, coloured in orange.

Main software components

  • Audio driver
  • MIDI driver (Kernel sequencer client driver)
  • UIO driver + user program

Details of the software design are explained below.

Device driver design for Zedboard synthesizer (coming soon)
