Is it possible to drive Titan Micro display controllers with an I2C bus?

Ken Yap wrote 04/14/2022 at 18:00 • 5 min read

Many people have noticed the resemblance between the I2C protocol and the protocol used by the Titan Micro series of LED display controller chips, of which the TM1637 is the most often seen, on display modules that cost a dollar or less, not much more than the parts that go into them. These modules are useful for simplifying projects and reducing the number of drive lines needed from the MCU. I have myself written a comparison of the two protocols: I2C, and what I'll call TMP.

I saw a suggestion on EEVBlog that TM chips could be driven by a standard I2C bus. Was this a tested idea or just speculation, I wondered, so I decided to find out.

While the low level details of the start and stop conditions are the same, two other aspects need to be handled:

  1. The bit order in the stream is most significant bit first for I2C and least significant bit first for TMP. This requires that all commands and data sent to the chip have the bit order within each byte reversed before use. This isn't as onerous as it sounds. Constants for commands can be reversed in the program code. For the display data, which is generated from a font table, the table entries can be reversed at coding time, or one could regard the segment-to-bit mappings as being reversed. In the program below I have elected to do it with a function, to keep things simple.
  2. TMP doesn't have a slave address; the first byte after the start condition is a command, so the command goes where the I2C slave address would go. The complication is that in I2C the LSb of the address byte is the direction bit, 0 for master to slave and 1 for slave to master. We are interested in the former, which means that the LSb has to be 0. Taking the bit reversal into account, this means that only TMP commands below 0x80 can be sent. The silicon is entitled to look at the direction bit and not transmit to the slave if it isn't 0. The sketch after this list shows how the mapping works out for the usual TMP commands.
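To make the mapping concrete, here is a small sketch of the arithmetic, assuming the byterev() helper defined in the programs below; tmpCommandToAddress is just an illustrative name, not something used in those programs.

uint8_t tmpCommandToAddress(uint8_t cmd)
{
  // The byte on the wire is (7-bit address << 1) | R/W, and the Wire library
  // always sends R/W = 0 for a write. For the TM chip to see command cmd,
  // the wire byte must equal byterev(cmd), so the 7-bit address is
  // byterev(cmd) >> 1, which only works when byterev(cmd) has a 0 LSb,
  // i.e. when cmd < 0x80.
  return byterev(cmd) >> 1;
}

// byterev(0x40) = 0x02 -> address 0x01, R/W 0: sendable (data command)
// byterev(0xc0) = 0x03 -> would need R/W 1: not sendable (set address command)
// byterev(0x8f) = 0xf1 -> would need R/W 1: not sendable (display control command)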

The previous points explain why the following Arduino sketch for scrolling digits doesn't work as expected. The LED display doesn't turn on, and only if a working bit-banging sketch has been run beforehand will the LEDs light up.

The set address command of TMP also doesn't work, as it's 0xCn where n is 0..5, all of which are above 0x80. At power-up n defaults to 0, so all 6 bytes of the display data have to be sent each time, otherwise not all the digits can be seen.
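For reference, a complete TMP update normally looks like the sequence below, written here with the start(), writevalue() and stop() bit-banging helpers that appear in the second sketch further down; this is the sequence the I2C experiment tries to reproduce.

start();
writevalue(0x40);           // data command: write display data, auto-increment address
stop();
start();
writevalue(0xc0);           // address command: start at digit 0
for (uint8_t i = 0; i < 6; i++)
  writevalue(display[i]);   // all 6 digit bytes
stop();
start();
writevalue(0x8f);           // display control: display on, maximum brightness
stop();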

// Uno
#define SDA     A4      // A4 is the Uno's hardware I2C SDA pin
#define SCL     A5      // A5 is the Uno's hardware I2C SCL pin

#include <Wire.h>

static uint8_t startnum = 1;
static uint8_t display[6];

// segment patterns for the digits 0-9 in the usual gfedcba bit order
static const uint8_t font[] = { 0x3f, 0x06, 0x5b, 0x4f, 0x66, 0x6d, 0x7d, 0x07, 0x7f, 0x6f };

// each 4-bit nibble with its bits reversed, used by byterev() below
static const uint8_t revmap[] = { 0x0, 0x8, 0x4, 0xc, 0x2, 0xa, 0x6, 0xe, 0x1, 0x9, 0x5, 0xd, 0x3, 0xb, 0x7, 0xf };

// reverse the bit order of a byte: I2C hardware shifts out MSb first, TMP expects LSb first
uint8_t byterev(uint8_t b)
{
  uint8_t lo = b & 0xf;
  uint8_t hi = b >> 4;
  return revmap[lo] << 4 | revmap[hi];
}

void setdisplay()
{
  Wire.beginTransmission(byterev(0x8f) >> 1);  // set brightness (doesn't work: 0x8f >= 0x80)
  Wire.endTransmission();
}

void writedigits()
{
  Wire.beginTransmission(byterev(0x40) >> 1);   // data command (works: 0x40 < 0x80)
  Wire.endTransmission();
  Wire.beginTransmission(byterev(0xc0) >> 1);   // set address (doesn't work: 0xc0 >= 0x80)
  Wire.write(display, sizeof(display));
  Wire.endTransmission();
}

void setup()
{
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(SDA, OUTPUT);
  pinMode(SCL, OUTPUT);
  Wire.begin();
  Wire.setClock(100000);
  setdisplay();
}

void loop()
{
  uint8_t first = startnum;
  for (uint8_t i = 0; i < sizeof(display); i++) {
    display[i] = byterev(font[first]);
    first++;
    if (first >= 10)
      first = 0;
  }
  writedigits();
  delay(500);
  startnum++;
  if (startnum >= 10)
    startnum = 0;
  digitalWrite(LED_BUILTIN, (startnum & 0x1) ? HIGH : LOW);  // toggle every 500 ms (1 Hz blink) to show the loop is running
}

The video at the top shows the scrolling one should expect to see, albeit on an ESP8266 platform instead of an Arduino.

But the send data command does work, as it's 0x40. So one solution would be to use bit-banging to initialise the chip, then switch to the silicon I2C support for display updates. It means slightly more code, but not much, for the display update. Here's a slightly modified version of the previous sketch where bit-banging is used to initialise the display, i.e. set the brightness to maximum, and I2C routines are then used to put out the display data.

// Uno
#define SDA     A4      // A4 is the Uno's hardware I2C SDA pin
#define SCL     A5      // A5 is the Uno's hardware I2C SCL pin

#include <Wire.h>

static uint8_t startnum = 1;
static uint8_t display[6];

// segment patterns for the digits 0-9 in the usual gfedcba bit order
static const uint8_t font[] = { 0x3f, 0x06, 0x5b, 0x4f, 0x66, 0x6d, 0x7d, 0x07, 0x7f, 0x6f };

// each 4-bit nibble with its bits reversed, used by byterev() below
static const uint8_t revmap[] = { 0x0, 0x8, 0x4, 0xc, 0x2, 0xa, 0x6, 0xe, 0x1, 0x9, 0x5, 0xd, 0x3, 0xb, 0x7, 0xf };

// reverse the bit order of a byte: I2C hardware shifts out MSb first, TMP expects LSb first
uint8_t byterev(uint8_t b)
{
  uint8_t lo = b & 0xf;
  uint8_t hi = b >> 4;
  return revmap[lo] << 4 | revmap[hi];
}

// send the start condition to the TM1637: SDA falls while SCL is high
void start(void)
{
  digitalWrite(SCL, HIGH);
  digitalWrite(SDA, HIGH);
  delayMicroseconds(5);
  digitalWrite(SDA, LOW);
  digitalWrite(SCL, LOW);
  delayMicroseconds(5);
}

// send the stop condition: SDA rises while SCL is high
void stop(void)
{
  digitalWrite(SCL, LOW);
  digitalWrite(SDA, LOW);
  delayMicroseconds(5);
  digitalWrite(SCL, HIGH);
  digitalWrite(SDA, HIGH);
  delayMicroseconds(5);
}

// clock out 8 bits LSb first, then release SDA and read back the ACK bit
bool writevalue(uint8_t value)
{
  for (unsigned int mask = 0x1; mask < 0x100; mask <<= 1)
  {
    digitalWrite(SCL, LOW);
    delayMicroseconds(5);
    digitalWrite(SDA, (value & mask) ? HIGH : LOW);
    delayMicroseconds(5);
    digitalWrite(SCL, HIGH);
    delayMicroseconds(5);
  }
  // wait for ACK
  digitalWrite(SCL, LOW);
  delayMicroseconds(5);
  pinMode(SDA, INPUT);
  digitalWrite(SCL, HIGH);
  delayMicroseconds(5);
  bool ack = digitalRead(SDA) == 0;
  pinMode(SDA, OUTPUT);
  return ack;
}

void setdisplay()
{
  start();
  (void)writevalue(0x8f);       // for changing the brightness (0x88-dim 0x8f-bright)
  stop();
}

void writedigits()
{
  Wire.beginTransmission(byterev(0x40) >> 1);   // data command (works: 0x40 < 0x80)
  Wire.endTransmission();
  Wire.beginTransmission(byterev(0xc0) >> 1);   // set address (doesn't work, ignored: 0xc0 >= 0x80)
  Wire.write(display, sizeof(display));
  Wire.endTransmission();
}

void setup()
{
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(SDA, OUTPUT);
  pinMode(SCL, OUTPUT);
  setdisplay();
  Wire.begin();
  Wire.setClock(100000);
}

void loop()
{
  uint8_t first = startnum;
  for (uint8_t i = 0; i < sizeof(display); i++) {
    display[i] = byterev(font[first]);
    first++;
    if (first >= 10)
      first = 0;
  }
  writedigits();
  delay(500);
  startnum++;
  if (startnum >= 10)
    startnum = 0;
  digitalWrite(LED_BUILTIN, (startnum & 0x1) ? HIGH : LOW);  // toggle every 500 ms (1 Hz blink) to show the loop is running
}

So my conclusion is that the suggestion only partly works.

What are the benefits?

Offloading the work to silicon means that the MCU isn't tied up for long periods while the bit-banging executes. Instead of having to handle the output line for every bit, the MCU loads the output register once per byte. Displaying a buffer could even be handled by an interrupt routine triggered by a buffer update. Finally, the I2C silicon takes care of the protocol timing, so we don't have to tune the delays needed for bit-banging.
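As a rough illustration of that last idea, here is a minimal sketch of one way to decouple buffer updates from the transfer, assuming the display[], font[], byterev() and writedigits() definitions from the sketches above; displayDirty and updateDisplay are invented names for the illustration. Note that the Arduino Wire library itself uses interrupts and blocks until a transfer completes, so rather than calling it from inside an ISR it is safer to have the ISR (or whatever updates the buffer) set a flag and let loop() do the actual transfer.

volatile bool displayDirty = false;   // set by whatever code updates display[]

void updateDisplay(const uint8_t digits[6])   // digits[] holds values 0-9
{
  for (uint8_t i = 0; i < 6; i++)
    display[i] = byterev(font[digits[i]]);
  displayDirty = true;                // ask the main loop to push the buffer out
}

void loop()
{
  if (displayDirty) {
    displayDirty = false;
    writedigits();                    // two short hardware I2C transactions
  }
  // the MCU is free to do other work here
}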

Is it worth it? Unless the bit-banging interferes with a time-critical task running on the MCU, probably not: 7-segment displays are viewed by humans and update periods are human-scale, so this technique has limited application.
