July 17, 2024
Recording Badge Scans in Apache Pinot

About Pinot

Apache Pinot® is a real-time distributed OLAP datastore. It’s designed to deliver fast aggregation of metrics over very large data sets and is optimized for fast ad-hoc queries. Pinot is column-oriented, scales to trillions of rows, and can ingest data in real time.

About This Tutorial

In this post, I’ll show you how I built a badge scanner using smart cards that records every badge scan in Pinot. I’ll also show you how to query the data in Pinot to get insights into badge scans.

I used Kafka as the messaging system to send badge-scan events to Pinot. I’ll show you how to set up both Kafka and Pinot in the cloud. All of the code required for this is available on GitHub.


This is both a hardware and software exercise, so you’ll need a bit of equipment to get this going. You will need:

  1. A smart-card reader (I recommend the ACR122U because that’s what I used in creating this demo)
  2. Some smart cards (I recommend the NXP NTAG215 because that’s what I used in creating this demo)

Some knowledge of Go is also helpful, but not essential. All of the code for this is in the badge-reader repository on GitHub.

Setting Up the Hardware

The first thing you need to do is set up the hardware. Here’s what you need to do to set up the ACR122U smart-card reader and the NXP NTAG215 smart cards.

Setting Up the ACR122U

The ACR122U is a USB smart-card reader, and it’s a pretty simple device to set up: just plug it into your computer and install the drivers, which you can find on the ACR122U product page.

Note: On macOS, you just need to plug the device in; no drivers are needed. On Windows, you need to install the drivers. On Linux, you need to install the drivers and the libnfc library.

Setting Up the NXP NTAG215

There’s really no setup for this one. It’s just a card with an NFC chip inside, and it’s inert until you place it on the reader.

Setting Up the Software

The next thing you need to do is set up the software. Here’s what you need to do to set up the badge reader and the Kafka and Pinot servers.

Setting Up the Badge-Reader Software

The badge reader is a Go program that reads badge scans from the ACR122U and sends them to Kafka. It’s a pretty simple program: it runs in an infinite loop waiting for a card to be placed on the reader. When a card is placed on the reader, it reads the UID from the card, sends it to Kafka, and then waits for another card to be placed on the reader.

I created a data type for the card information:

type Card struct {
  Badge string `json:"badge"`
  Time  int64  `json:"time"`
}

The Badge field is the UID of the card, and the Time field is the time the card was scanned, in milliseconds since the epoch.

The main function is pretty short, too:

func main() {
    for {
        ctx, err := scard.EstablishContext()
        if err != nil {
            log.Fatal(err)
        }
        readers, err := ctx.ListReaders()
        if err != nil {
            log.Fatal(err)
        }
        if len(readers) > 0 {
            index, err := waitUntilCardPresent(ctx, readers)
            if err != nil {
                log.Fatal(err)
            }
            card, err := ctx.Connect(readers[index], scard.ShareExclusive, scard.ProtocolAny)
            if err != nil {
                log.Fatal(err)
            }
            // The GET UID APDU; the response includes the card UID
            // followed by the two status bytes.
            command := []byte{0xFF, 0xCA, 0x00, 0x00, 0x00}
            rsp, err := card.Transmit(command)
            if err != nil {
                log.Fatal(err)
            }
            uidHex := hex.EncodeToString(rsp)
            fmt.Println("Card UID:", uidHex)
            newCard := Card{Badge: uidHex, Time: time.Now().UnixMilli()}
            message, err := json.Marshal(newCard)
            if err != nil {
                log.Fatal(err)
            }
            fmt.Println("Message:", message)
            err = sendToKafka("badges", string(message))
            if err != nil {
                log.Fatal(err)
            }
            _, err = waitUntilCardGone(ctx, readers)
            if err != nil {
                log.Fatal(err)
            }
            card.Disconnect(scard.ResetCard)
        }
        ctx.Release()
    }
}

I run an infinite loop that connects to the ACR122U and waits for a card to be placed on the reader. When a card is placed on the reader, the program reads the UID from the card and sends it to Kafka. It then waits for the card to be removed from the reader.

I put the logic for waiting for a card to be placed on the reader, and for waiting for a card to be removed, in separate functions to keep things as simple as possible. The waitUntilCardPresent function is pretty simple:

func waitUntilCardPresent(ctx *scard.Context, readers []string) (int, error) {
  rs := make([]scard.ReaderState, len(readers))
  for i := range rs {
    rs[i].Reader = readers[i]
    rs[i].CurrentState = scard.StateUnaware
  }
  for {
    for i := range rs {
      if rs[i].EventState&scard.StatePresent != 0 {
        return i, nil
      }
      rs[i].CurrentState = rs[i].EventState
    }
    err := ctx.GetStatusChange(rs, -1)
    if err != nil {
      return -1, err
    }
  }
}

It’s another infinite loop waiting for the card status to change. When the status changes, it returns the index of the reader that the card was placed on. When a card is placed on the reader, the status changes to StatePresent. When the card is removed from the reader, the StatePresent flag is cleared, so the waitUntilCardGone function waits for the status to change back before it returns.
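The waitUntilCardGone function isn’t shown here, but it mirrors waitUntilCardPresent with the presence check inverted. A sketch of what it might look like (the version in the badge-reader repository is authoritative):

```go
func waitUntilCardGone(ctx *scard.Context, readers []string) (int, error) {
  rs := make([]scard.ReaderState, len(readers))
  for i := range rs {
    rs[i].Reader = readers[i]
    // Start from the known state: a card is present on the reader.
    rs[i].CurrentState = scard.StatePresent
  }
  for {
    // Block until some reader's state changes, then look for one
    // whose card is no longer present.
    if err := ctx.GetStatusChange(rs, -1); err != nil {
      return -1, err
    }
    for i := range rs {
      if rs[i].EventState&scard.StatePresent == 0 {
        return i, nil
      }
      rs[i].CurrentState = rs[i].EventState
    }
  }
}
```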

Finally, there’s a sendToKafka function that sends a message to Kafka:

func sendToKafka(topic string, message string) error {
  configFile := "./properties"
  conf := ReadConfig(configFile)
  p, err := kafka.NewProducer(&conf)
  if err != nil {
    return fmt.Errorf("error creating producer: %w", err)
  }
  p.Produce(&kafka.Message{
    TopicPartition: kafka.TopicPartition{
      Topic:     &topic,
      Partition: kafka.PartitionAny,
    },
    Value: []byte(message),
  }, nil)
  go func() {
    for e := range p.Events() {
      switch ev := e.(type) {
      case *kafka.Message:
        if ev.TopicPartition.Error != nil {
          fmt.Printf("Failed to deliver message: %v\n", ev.TopicPartition)
        } else {
          fmt.Printf("Produced event to topic %s: key = %s value = %s\n",
            *ev.TopicPartition.Topic, string(ev.Key), string(ev.Value))
        }
      }
    }
  }()
  p.Flush(15 * 1000)
  return nil
}

That’s it. Wait for a card, read the UID, send the UID to Kafka, wait for the card to be removed, and then start over. This is the kind of card reader that you could use to check people into a conference or a party, or to scan merchandise as it’s stocked or sold.

It’s also possible to read more data from the card while it’s present, but for now I’m just reading the UID. Some applications might want to read a name or job title from a badge, but every NFC card comes with a unique ID, so we’ll read that and correlate the badge ID with a user in the database.

Setting Up the Kafka Server

To make things easy, I used Confluent’s Cloud service to set up a Kafka server. I created a free account and then created a new cluster. I then created a new topic called badges and set the retention period to one day. I also created a new API key and secret, which I used to create a properties file to connect to the Kafka server.

# Required connection configs for Kafka producer, consumer, and admin
bootstrap.servers=<your cloud server>:9092
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<your api key>
sasl.password=<your api secret>

# Best practice for higher availability in librdkafka clients prior to 1.7
session.timeout.ms=45000

Running the Program

I ran the program on my laptop, scanned my badge, and saw the message show up in the Kafka topic.

$ go run main.go
Card UID: 045e0f7fdf61809000
Message: [123 34 98 97 100 103 101 34 58 34 48 52 53 101 48 102 55 102 100 102 54 49 56 48 57 48 48 48 34 44 34 116 105 109 101 34 58 49 54 56 49 51 51 52 49 50 48 50 48 53 125]
Produced event to topic badges: value = {"badge":"045e0f7fdf61809000","time":1681334120205}

Reading the Messages From Kafka Into StarTree Cloud

I created a new StarTree Cloud project and then went into Data Manager to create a new data source. I selected Kafka as the data source type and then entered the Kafka server information and the topic name.

Data Manager: Create new data source

When you click New Connection, you will be asked to enter your credentials for your Kafka server. I used the API key and secret that I created earlier, which are in the properties file referenced above.

New Connection: Enter your credentials for your Kafka server

The next screen will ask you to name your dataset and add a description, and then it will attempt to start retrieving data from Kafka.

Data Manager: Create dataset

If everything is working correctly, you should see data show up when you click the Check Sample Data button.

Check sample data

Once I connect to my Kafka broker, I can see a number of badge numbers and timestamps that were already in the broker.

Finally, you can click through to Create Dataset to add your new dataset. You’ll then see the Pinot table config and schema displayed.

Create dataset

Once the data source is active, you can go to the Data Explorer and start exploring your data.

Data Explorer

So far there’s not much data in here, and it may not look entirely useful yet, but if you follow this series of posts you will see how I use this data, in conjunction with other data, to build something quite useful.


I hope you enjoyed this post. I had a lot of fun building this project, and I’m looking forward to building more projects with the ACR122U and StarTree Cloud. If you have any questions or comments, please leave them below.