TinyGo: Small Is Going Big

Summary

Ron Evans talks about TinyGo - a compiler for Go, written in Go itself, that uses LLVM to achieve very small, fast, and concurrent binaries that can also target devices where Go could never go before. The talk includes live coding of devices, RISC-V, WebAssembly, and a drone, to show some of what can be done today using TinyGo.

Bio

Ron Evans is an open source software developer, businessperson, author, and speaker.

About the conference

Software is changing the world. QCon empowers software development by facilitating the spread of knowledge and innovation in the developer community. A practitioner-driven conference, QCon is designed for technical team leads, architects, engineering directors, and project managers who influence innovation in their teams.

Transcript

Evans R.: I am @deadprogram. In the real world, people call me Ron. I am @deadprogram on the internets: Twitter, GitHub, all the places. I'm a technologist for hire.

Evans S.: I am Salvador Evans. I am son of @deadprogram, or his apprentice.

Evans R.: We run a small consultancy called The Hybrid Group where we specialize in writing software for hardware companies, where we're all technologists for hire. We've done work for many different clients building the software for their hardware. Recently, we've been doing a lot of work for a really cool company called Northvolt, which is Europe's only lithium-ion battery manufacturer, a company based in Sweden doing all the batteries for the new BMWs. We've done a lot of work for a little company called Intel. We've also, along the way, created quite a few different open-source projects, in particular in the Go programming language. Gobot is very well known among people who create middleware, the middle layer that runs on embedded Linux. GoCV is computer vision in Go; it's a set of wrappers around OpenCV, which is a very popular and commonly used computer vision framework.

TinyGo

We're here to talk about TinyGo, which is now officially sponsored by Google. Go is really big in the cloud, by which we also mean Go is big: even a Hello World executable in Go 1.13, which is the release right before the most recent release, is very large. That's not a problem, because clouds are infinitely scalable. What about the small places? What about the little places, by which we mean microcontrollers, the small chips that actually run all the things in the world you care about, like the dialysis machine, or the brake systems, or other mission-critical devices, or the refrigerator, and also WebAssembly? What we're talking about is true Edge computing, not this so-called Edge computing. Edge computing is not the data center nearest you. I'm sorry, I must respectfully disagree. The real Edge computing is in what we call the last centimeter. That's what TinyGo is here for.

How TinyGo works is it's a trinity between Go, TinyGo, and LLVM. The Go compiler itself is written in Go. All cool languages eventually reach a point where they can be compiled in themselves. It's very self-referential, but it also proves that it works. The Go compiler toolchain is written in Go, and a lot of the things that it uses are actually built into the Go standard library. Then you've got LLVM. It's a framework for building compilers. It's being used by a couple of languages, one of them a very cool language called Rust. We really admire the Rust community. Also Swift, another very cool language, and more recently Zig, another cool language built on LLVM. The way that TinyGo works is we take the Go source code and we parse it using the Go standard libraries to create the Go static single assignment (SSA) form, which takes the Go code and reduces it down to a very simplified kind of syntax. Then TinyGo translates that into LLVM intermediate representation, which is what LLVM's toolchain takes. Then, using tools like Clang or LLD, which are the built-in toolchain that LLVM itself provides, we can compile for LLVM targets in very small places.

The Hello World of Things

Let's start with the Hello World of things, which is, of course, a blinking LED. For this, we're going to use a Digispark. The Digispark is an ATtiny85. It is an 8-bit processor. That big thing on there is not the ATtiny, that's the power regulator so we can plug it into your USB. That is the ATtiny, that other little one next to it. The ATtiny has only 8K of memory and it's an 8-bit processor. Of all of our pleasures, we're going to start with the gentlest. Let's take a quick look at the code for the Hello World of things. Who here does not know Go at all? Go is very similar to languages that came before; there are many things that you will recognize. In Go, we have the package main, which is the main executable unit. We can import external libraries, similar to require in JavaScript or Ruby. We're going to import the machine package, which is part of how TinyGo talks to hardware. We're going to import the time package, which is one of the Go standard libraries for managing time. Then we'll declare our function main, just like Python, or C, or other languages; that's our entry point into the program. We're going to say led := machine.LED; := both defines it and assigns to it, so this led takes whatever type machine.LED is. Then we're going to configure the LED as an output pin. When we're working with embedded devices, most pins are either output, making something happen, or input, reading something from the world. Then, in Go there is no while loop, there is only for. If we say for with no parameters, it means forever. Forever, we're going to call led.Low(). That sets it off, setting it to a low level in 5V TTL, a binary 0. We're going to time.Sleep for 500 milliseconds, one half of a second, and do nothing. Then we're going to call led.High(), which turns on the LED, a binary 1. Then we time.Sleep for another 500 milliseconds.
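
For reference, the blink program being described looks roughly like this; it follows the standard TinyGo blinky example:

    package main

    import (
        "machine"
        "time"
    )

    func main() {
        // machine.LED is the board's built-in LED pin.
        led := machine.LED
        led.Configure(machine.PinConfig{Mode: machine.PinOutput})

        for {
            led.Low() // binary 0: LED off
            time.Sleep(500 * time.Millisecond)

            led.High() // binary 1: LED on
            time.Sleep(500 * time.Millisecond)
        }
    }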

Let's run this. I'm going to run this program. It compiles the TinyGo program, which you'll note is 398 bytes in size. It's asking me to plug in the device. I'm going to plug the Digispark into my USB port so we can flash it. It erases it, and now it's flashed. If I have a battery handy, I plug it in. Here's a battery. This is another classic case of the battery being a lot larger than the device. If I go to the camera, you will see that the LED is flashing and there are no wires attached. Nothing up my sleeves, folks.

Let's see some basic input and output. For this, we're going to use a Digispark again, with an LED and a button. Same Digispark, same 8-bit processor. This program is very similar: package main, and we import the same machine and time packages. We have our LED that we declare. We also say button := machine.P0, which is the pin that the button is plugged into. We configure that as an input pin so that we can read the input. Then, forever, if button.Get(), which returns either true or false, tells us it's pushed, we turn the LED on. Otherwise, we turn it off, and then we wait 10 milliseconds before checking again. Let's make button. That one is 206 bytes. How is it smaller? We keep getting smaller. Eventually, we're at negative bytes. We plug this into our battery again. We go over to our camera. As I push the button, you see the LED turns on: Morse code. I'm sending a message to the past that says please stop using carbon and helium. I need that helium for my blimp. TinyGo blink with button.
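
The button version described above looks roughly like this; the pin constant (machine.P0 here) is an assumption and depends on the board definition:

    package main

    import (
        "machine"
        "time"
    )

    func main() {
        led := machine.LED
        led.Configure(machine.PinConfig{Mode: machine.PinOutput})

        // The pin the button is wired to; the exact constant depends on the board.
        button := machine.P0
        button.Configure(machine.PinConfig{Mode: machine.PinInput})

        for {
            if button.Get() { // true while the button is pushed
                led.High()
            } else {
                led.Low()
            }
            time.Sleep(10 * time.Millisecond)
        }
    }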

How TinyGo Knows About Hardware

How does TinyGo actually know about hardware? That was neat. "You showed me 20 lines of code and it did something." How does that actually work? For that we have to go way inside the chip. Deeply inside. All the way inside to registers. You may have heard of registers; somebody said something about assembly language and registers. I like to think about registers like little valves. You turn them on and off, and then they make the electrical circuit work, like water flowing through a series of pipes, a series of tubes, like the internets themselves. When we have a microcontroller, especially the ones that are created by all the different ARM licensees, they actually provide a description of all the different capabilities built in, in a file that's known as a System View Description file, or an SVD file. Here's a tiny excerpt from one small part of one SVD file. You'll notice that it's in XML, one of our favorite markup languages. This is describing the port. The port is how we are able to turn the Blinky on and off, or read the buttons. It tells us which addresses and offsets hold that, all this information at a very low level. We take those SVD files and we use them to generate Go code. We read them, we actually parse them, and we generate Go wrappers for them. This is something which is also done by Rust embedded; it is also done by Python. We're just using the same techniques with another language.

We create the device package. The device package takes that and turns it into a Go type, where now we have the direction, whether it should be cleared or set. This itself is very low level as well. It's very hard to work with these individual bits and individual registers, even if it's in a higher-level language like Go. Here's an example of where we're setting the output port of that one bit, bit three, to set. That's how we're actually turning the Blinky LED on and off on that Digispark. That's a little too low level. Instead, we have the machine package on top of the device package. The machine package is our hardware abstraction layer. This is how the Set implementation works. We look at the pin number. If it's less than 32, and you're turning it on, then we're able to set that pin on, otherwise we clear the pin. It's still pretty low level, but at least it gives us the ability to say set the pin off or set the pin on. It knows exactly which register and which bit within that register, and that's all taken care of for you. We have wrappers around the typical peripherals: general purpose input/output, or GPIO. This is the code you saw before: the LED is machine.LED, configure it as an output pin and set it to true. We also have support for UART, which is the Universal Asynchronous Receiver-Transmitter; SPI, which is the Serial Peripheral Interface; and I2C, the Inter-Integrated Circuit bus, which is how all the chips on a single board can communicate with each other.
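
As a rough sketch of that layering (illustrative, not the actual TinyGo source; the device package and register names here assume a chip with two 32-pin ports, such as an nRF52):

    package machine

    import "device/nrf" // device package generated from the chip's SVD file

    type Pin uint8

    // Set drives a pin high or low by writing a single bit into the port's
    // OUTSET or OUTCLR register, which is the decision described above.
    func (p Pin) Set(high bool) {
        port, bit := nrf.P0, uint32(p)
        if p >= 32 {
            port, bit = nrf.P1, uint32(p-32)
        }
        if high {
            port.OUTSET.Set(1 << bit)
        } else {
            port.OUTCLR.Set(1 << bit)
        }
    }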

High-Speed LED Strips

Let's see an example that puts this all together, and let's use our Digispark one more time. We're going to use the Digispark with a strip of NeoPixel LEDs. Each of these LEDs is actually an individual microcontroller itself that does nothing more than control the RGB LED. We're plugged into our Digispark. It's going to, at very high speed, send these signals to the LEDs. The way that this works is thanks to the TinyGo drivers. The TinyGo drivers are a library of all of the different types of devices that we support: digital thermometers, accelerometers, different lighting controllers. I think we have about 40 different devices currently supported by the community, with more being added all the time. We're not going to go over every single bit of code, but just to give you an idea: same package main. We're going to import the Go standard image/color package, so that we're just using normal Go code for color assignment. Machine and time, our friends from before. Then the WS2812; that's the real chip name for the Adafruit NeoPixels. NeoPixel is a much cooler branding title. You can use random LEDs that you've acquired through various sources as well; they will probably work. We have 10, because we have 10 different LEDs. We've got our LED. We've got our NeoPixels on pin 0. We say ws2812.New; in Go, sometimes we'll have a constructor. It's going to return one of these drivers, assigned to that pin, pin 0. Then, forever, we're going to go through and set each of these LEDs on and off. For this one, it's going to be red and green, a Christmassy theme. We're going to write those colors, and then we're going to sleep for 100 milliseconds.
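
Sketched out, the NeoPixel program described above looks something like this, using the ws2812 package from the TinyGo drivers; the pin constant is again an assumption that depends on the board:

    package main

    import (
        "image/color"
        "machine"
        "time"

        "tinygo.org/x/drivers/ws2812"
    )

    const count = 10 // ten LEDs on the strip

    func main() {
        neo := machine.P0 // pin the NeoPixel data line is wired to (board dependent)
        neo.Configure(machine.PinConfig{Mode: machine.PinOutput})

        leds := ws2812.New(neo)

        colors := make([]color.RGBA, count)
        swap := false
        for {
            swap = !swap
            for i := range colors {
                // alternate red and green for the Christmassy theme
                if (i%2 == 0) == swap {
                    colors[i] = color.RGBA{R: 0xff}
                } else {
                    colors[i] = color.RGBA{G: 0xff}
                }
            }
            leds.WriteColors(colors)
            time.Sleep(100 * time.Millisecond)
        }
    }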

Demo 1

If we make NeoPixels, this is a whole 728 bytes. Actually, before I flash it, one really key thing to know is that the only way to get this high-speed performance is to use assembly language. You need to use assembly language in any programming language you program these NeoPixel lights in, because nothing else is fast enough to send the updates, especially if you end up with hundreds of them in a string. One of the things about TinyGo is we can actually compile assembly language code right into your TinyGo code. In this case, we're using AVR assembly language. This is just a small excerpt of it. This is how it's able to combine with the Go code: the Go code figures out which pins to set, then the assembly language actually does the setting. Let's compile that program. Let's plug in the NeoPixel through the Digispark. If you've plugged it in the right direction, it works.
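
As a minimal illustration of that mechanism (the exact helper and package can vary between TinyGo releases, so treat this as an assumption rather than the real driver code):

    package main

    import "device/avr" // per-architecture device package for AVR chips

    func main() {
        // Inline assembly straight from Go: here just a single AVR no-op.
        // The real WS2812 driver uses a longer, cycle-exact hand-tuned
        // sequence; this only shows that assembly can be embedded at all.
        avr.Asm("nop")
    }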

Next, a bouncing ball using the PyPortal. That's the PyPortal there. The PyPortal is from Adafruit. It uses the Microchip ATSAMD51, which is a 32-bit ARM Cortex-M4 running at 120 Megahertz with 256K of RAM. That's pretty large, lots of headroom for us in TinyGo land. It also has a built-in WiFi chip, the u-blox NINA-W102. It's basically the same as the ESP32, which is the Espressif WiFi chip that is really commonly used in a lot of different devices. It runs at 240 Megahertz. You need that much faster processing speed just to do SSL, basically. That's the reason for that. You can find all the code here on the TinyGo drivers website.

This is the PyPortal. It's got the WiFi chip. It's got room for an SD card. It's got a few different sensor buttons and connectors that we can plug in. Make ball. We plug it in. This is a bouncing ball demo. Let's flash it. We have different flashing software that is currently supported by TinyGo. This is a whole 27K program that's really large. What it does is it shows us a bouncing ball on the screen. There we go. That's it. A really cool demo. No, it sucks. Here's why. As cool as this is, we're not even using a fraction of the processing power in this 32-bit microcontroller. I know you're like, it's bouncing around. It's like physics. No, we can go so much further than this.

Let's take a look. It's time for the LED cube. The LED cube is the Cube of Fire. It's an ItsyBitsy M4 with six little tiny LED displays, using a board called the HUB75. The HUB75 is what's used on those giant stadium-size lighting displays; they just have many of them all stacked together. We're going to use six of them to make this cube. It uses the ItsyBitsy M4, which is an ARM Cortex-M4 32-bit processor. This has 192K of RAM, so even tinier, and the HUB75 LED panels. In order to achieve this, we're going to need to use direct memory access. Direct memory access is a way that we can do very fast input/output, I/O, by letting the actual I/O devices talk directly to the memory, leaving your microcontroller free to do other things. We're also going to use interrupts. Interrupts, in embedded software and any software really, are for when something happens that has to be handled right now: it sends a signal to the main processor called an interrupt. That interrupt tells you, "Stop what you're doing right this second and do this other thing," like shut down the battery before the meltdown. Or, in our case, we want to be able to update the display while we're rendering what happens on the next frame.

Here's an example using the UART, where we set our interrupt. We declare a new interrupt, in Go again, saying that when that interrupt occurs, call the handle-interrupt function. We set its priority to 0xC0, so not super high priority, and we enable it. The Go function that we call when that interrupt is triggered says: get from the register the value that's just been typed by the user on their keyboard that's connected to the UART, then let the rest of the code know about it. If you want to see the code for the Cube of Fire, it's currently on my collaborator Aykevl's personal repository.
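
The shape of that code, using TinyGo's runtime/interrupt package, is roughly as follows; the interrupt number and the register read are placeholders, since the real names come from the chip's device package:

    package main

    import "runtime/interrupt"

    const uartIRQ = 2 // placeholder: the chip's UART receive interrupt number

    // handleUART runs whenever the UART interrupt fires: read the byte the
    // user just typed out of the UART data register and pass it along, for
    // example by pushing it into a ring buffer the rest of the code reads.
    func handleUART(i interrupt.Interrupt) {
        // read the UART peripheral's data register here
    }

    func main() {
        intr := interrupt.New(uartIRQ, handleUART)
        intr.SetPriority(0xc0) // not super high priority
        intr.Enable()

        for {
            // the main program keeps running; the handler fires asynchronously
        }
    }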

It makes no sense to be called the Cube of Fire if we don't have any fire. Let's fix that right now. Let's take a quick look, and let's use the fire demo. If we make cube and plug it in. Also, it helps if it's switched on. It compiles to 21K of code. The Cube of Fire. You don't see anything on top because fire goes up from the bottom to the top. We have a whole physics simulation of fire running on this 32-bit ARM microcontroller, doing all of this at a pretty decent frame rate. As promised, the Cube of Fire.

A bunch of you have seen us wearing these badges around today. These badges are also written using TinyGo. I like to fly things. I also like programmable badges. I thought the only thing better than a programmable badge is one that we use to fly a drone, just because that's how we do it. We're going to plug this joystick into the ports on the badge, conveniently provided. This is a Go badge. A Go badge is actually a PyBadge from Adafruit with a GoBridge sticker on it for now, but we'll fix that. I'm going to plug in the badge. Then we'll take a quick look at some code here. It's using the PyBadge, which is another ARM Cortex-M4 32-bit running at 120 Megahertz; that's the chip du jour. We're going to communicate using the USB Communications Device Class, or CDC. This is how we can make a device use USB and act like a serial port. This is the architecture of it. We have the PyBadge. It reads in the buttons using the GPIO interface, and reads the joystick using the analog I/O. Then it's going to send updates to the little TFT display that's built in, and send the information about the drone control to the USB. We have our devices from the TinyGo drivers, the display, and the buttons. We have our SPI interface, which is how we communicate with the display. You'll notice here that we have a Goroutine, go handleDisplay. TinyGo is just Go. We don't consider ourselves separate from Go. We have the concurrency capabilities of Go. When we say go handleDisplay, it runs this function as a Goroutine and then immediately returns while that function is still executing by itself. We're going to initialize the analog-to-digital converter. We read the joystick and the buttons. Then, based on that, we're just going to send that information out to the serial port. This is the function that's running in the Goroutine, which is displaying on that little display the buttons that we push and the readings from the joystick. Let's flash it. Fly badge. There we go. Let's flash it, a 19K program. I got the x and the y axis correct. When I push the buttons over here, that's going to be takeoff. Push the battery in. We see the Go badge, the flight badge as it were.
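
The concurrency pattern being described is plain Go; in outline it looks like this, where handleDisplay, readControls, and sendCommand are hypothetical stand-ins for the badge code:

    package main

    import "time"

    // handleDisplay redraws the little TFT with the latest joystick and
    // button state; it runs on its own as a goroutine.
    func handleDisplay() {
        for {
            // update the display here
            time.Sleep(50 * time.Millisecond)
        }
    }

    func readControls() (x, y int, fire bool) { return } // ADC for the joystick, GPIO for the buttons

    func sendCommand(x, y int, fire bool) {} // drone control out over USB CDC

    func main() {
        // TinyGo is just Go: this starts handleDisplay as a goroutine and
        // returns immediately while it keeps executing concurrently.
        go handleDisplay()

        for {
            x, y, fire := readControls()
            sendCommand(x, y, fire)
            time.Sleep(10 * time.Millisecond)
        }
    }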

The full application is actual flight control. For this, we're going to have our flight stick that we just programmed; a ground control system running on my notebook that actually uses computer vision to do deep neural network recognition of people's faces; a Tello drone, which is that drone; and an MQTT server, which is Mosquitto, an open-source machine-to-machine messaging broker currently maintained by the Eclipse Foundation. This is the Tello. The Tello is a very cool drone. It's made by DJI, the Chinese drone company. It actually has in it a special chip called the Movidius X. The Movidius X is a chip made by a company that Intel bought, called Movidius, and it is a specialized chip intended to do deep neural networks and other types of concurrent computing. It just so happens that it is also controlling the flight system in this drone. We're going to bring in Gobot. Gobot is what's running on my notebook computer; that's our middleware. The DJI drone, it's a Movidius 2. We're also going to use GoCV to do the computer vision part. Here's a quick look at the architecture. We're going to use the PyBadge as our flight stick. The ground control system is going to read that information and use it to send commands both to the drone and to our machine-to-machine messaging broker. Then, using the UDP packets of the video, we're going to send that through our computer vision. The two parts that are most interesting are where we set up the drone so that we actually grab that video, and where, as we get the flight data events, we post those to our MQTT machine-to-machine messaging server. Let's see if we have everything plugged in. Yes. Let's see if I'm connected to it. No. It should connect to it. It did. I have a bad key in my notebook.
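
On the ground control side, the Gobot part looks roughly like the standard Gobot Tello example; the MQTT publishing and the GoCV wiring are left out of this sketch:

    package main

    import (
        "fmt"
        "time"

        "gobot.io/x/gobot"
        "gobot.io/x/gobot/platforms/dji/tello"
    )

    func main() {
        drone := tello.NewDriver("8888")

        work := func() {
            // flight data events arrive here; the full demo forwards them to MQTT
            drone.On(tello.FlightDataEvent, func(data interface{}) {
                fmt.Println("flight data:", data)
            })

            drone.TakeOff()
            gobot.After(10*time.Second, func() {
                drone.Land()
            })
        }

        robot := gobot.NewRobot("tello",
            []gobot.Device{drone},
            work,
        )

        robot.Start()
    }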

Before we do anything, let's go to our machine-to-machine messaging server, Mosquitto, and we're going to subscribe to these messages that will theoretically be coming from the drone. Then we're going to go back and run our demo. We make Tiny drone. I should probably be holding the joystick when it takes off, don't you think? Not a bad idea. It's connected. We'll know it's working because we should see a little video window pop up, hopefully. Here it is. It's looking at people's feet right now. It's searching for humans. We'll give it what it wants in a minute. It only has maybe about a 10-minute battery life, so you just have to run faster than your friend for about 7 minutes. Let's jump over real quick so we can see the MQTT messages. Very good. This other camera view. Maybe it doesn't like them on top of each other. I'm not seeing any video updates coming, which makes me wonder if it's getting updated. Let's shut down the other video. Something went wrong. Nothing yet. It was running until I changed that window. Maybe the drone fell asleep. It keeps telling me that. No, it's still awake. It's definitely looking for something. We've got to boot the drone up again. Turn the power on. Let's take a look here. It should only take a minute. This time we're ready. Let's just go directly to takeoff.

There is a human now. One really important thing about drones is that you need to be able to do a flip. A flip is to the drone like a lens flare is to a photographer. It's overused. I'm ok with it. I guess it's time to bring it in for landing. Wait, there's one really important thing we forgot. Actually, the thing I did want to do real fast is show you that it actually is getting its MQTT messages. We're at 65% battery and there's no humans in the picture. Yes, there are. It's got humans, true and false at the bottom. It's detected a few of you. We should bring it in for landing now before we actually do some damage. There we have it.

The Future of Edge Computing

Let's briefly talk about modern compilation targets. In actuality, the future of Edge computing has already begun. There are a lot of people who've been working on it for quite a while. We are just the latest to pick up the Cube of Fire and carry it along in our own little ways. The future's future is about RISC-V. It's about WebAssembly. The only way to make it better is if we combine both of those into a single demo, the demo that brings all the demos together. We have something called the TinyGo Playground. The TinyGo Playground is at play.tinygo.org. Similar to the Go Playground, which lets you compile Go code in the cloud, the TinyGo Playground lets you compile TinyGo. This is the SiFive HiFive1 rev B board. That is a board made by a company called SiFive, based in California and China. It uses the Freedom E310 processor, which is a RISC-V processor. It runs at 320 Megahertz and has 4 Megabytes of flash.

Demo 2

For this demo, I'm going to program it with a relatively straightforward Blinky program. It's plugged in. You'll notice that it's got a little simulator where it's blinking the LED. This code is running entirely in WebAssembly. What's happening is that play.tinygo.org uses the TinyGo compiler running in the cloud to compile the Go code to WebAssembly and then executes it in the browser. That way, we can create a simulator which is not a simulator at all; it's actually the same code that we're going to run on the device. Don't believe me? Let's download it by clicking this flash button. There it is. If I take this firmware file and I drag it onto the HiFive board, it will flash that board and then reboot it. You'll notice that it's cycling through the LEDs. I showed you the code before; it's going red, green, blue. We've actually just written a TinyGo program, compiled it using play.tinygo.org in the cloud, downloaded that HEX file from the internet, and flashed it right onto this board, right here, right now. The future is already here. Even the future's future is here. It's amazing. Tinygo.org, that's our website. You'll find lots of information there. TinyGo itself is licensed under a BSD license just like Go itself, so you can use it for commercial purposes or anything else.

Questions and Answers

Participant 1: How's the real-time aspect of this? Go wasn't written to be real-time. Talking to the LED strip and all that: if garbage collection happens during that strip update, it will fail. How's that working out for you?

Evans R.: It's worked out really well. How do you do normal Go-type things like garbage collection and concurrency? We have our own runtime that we have written. It's the Go programming language, but with a different runtime, because if you think about the way a Go program normally executes, it's running on some type of operating system: Linux, macOS, Windows, FreeBSD. That operating system provides some lower-level capabilities for talking to the physical devices; in Linux, those might be device drivers. With TinyGo, we're using LLVM. We're compiling our TinyGo code to this very compact code. Because we've written our own runtime, we actually have multiple garbage collection implementations, though not all used at the same time. You can choose which garbage collector you'd like to use. Depending on your particular use case, you might choose the conservative garbage collector, which basically does garbage collection when it runs out of memory. We have that extreme to the opposite extreme, which is no garbage collection, which means that when you try to compile code that allocates, it will panic. It's totally possible to write Go code without using make, without allocating new memory, just basically using static memory assignment. That might be a case where you want to use it.
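
A tiny sketch of that allocation-free style: a fixed-size global buffer instead of make, so nothing ever needs to be garbage collected.

    package main

    // readings is statically allocated: no make, no heap, no GC pressure.
    var readings [32]int16
    var next int

    func record(sample int16) {
        readings[next] = sample
        next = (next + 1) % len(readings)
    }

    func main() {
        for i := int16(0); i < 100; i++ {
            record(i)
        }
    }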

When we moved to using LLVM, our first implementation of Goroutines used LLVM coroutines. LLVM coroutines are concurrency primitives built into LLVM. The problem with LLVM coroutines is that they have no knowledge of when they're executing. They just pause anywhere inside the implementation of the coroutine in order to let other coroutines have a chance to execute. That's not very efficient. It also doesn't let you handle situations like wanting to turn the LED strip on or off at exactly the right time. We then implemented our own task scheduler, so that we have more control over when this garbage collection occurs. As you saw with the LED cube, we can get some fairly substantial performance. The LED cube does not use assembly language; it uses DMA and IRQs. The Digispark uses assembly language because there's no other way to do that on a minimalist microcontroller. In fact, if you look at the implementations in Python, in JavaScript, in every language, even Arduino's implementation, which is Arduino C++, they still use embedded assembly language to talk to these particular LEDs.

Participant 2: What's the secret of the code being so small? Because, obviously, that's a big problem with Go. Where are you saving this space?

Evans R.: How are you saving space when Go programs themselves are so large? It goes back to the fact that we have implemented our own runtime. We have to implement a runtime that expects no operating system at all. When you get rid of most of the Go runtime, a Go program itself is actually quite small. This is a key benefit if you're writing Go for servers, because server deployment is quite simple: it's statically linked, and a single executable contains everything you need. Great for cloud deployments. We have the exact same thing for embedded device deployments, the difference being that we don't bring along the entire runtime, only the things that we actually need.

 


Recorded at:

Aug 06, 2020
