
quipo / kafka-php

PHP client library for Apache Kafka

Latest release: v1.0.0 · Apache License 2.0


kafka-php allows you to produce messages to the Apache Kafka distributed publish/subscribe messaging service.


Requirements

  • Minimum PHP version: 5.3.3.
  • Apache Kafka 0.6.x or 0.7.x.
  • TCP access to your Kafka instance. You can obtain a copy of Kafka and instructions on how to set it up at https://github.com/kafka-dev/kafka
  • The PHP Zookeeper extension is required if you want to use the Zookeeper-based consumer.
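Before running the examples, it can help to verify that the broker is actually reachable over TCP. The helper below is an illustrative sketch, not part of kafka-php; the host and port are assumptions matching the examples further down.

```php
<?php
// Illustrative helper (not part of kafka-php): check whether a Kafka
// broker accepts TCP connections before constructing a producer/consumer.
function isBrokerReachable($host, $port, $timeoutSeconds = 5)
{
    $errno  = 0;
    $errstr = '';
    // @ suppresses the warning fsockopen emits on failure;
    // it returns false when the connection is refused or times out
    $socket = @fsockopen($host, $port, $errno, $errstr, $timeoutSeconds);
    if ($socket === false) {
        return false;
    }
    fclose($socket);
    return true;
}
```

A quick guard before the producer/consumer examples could then be: `if (!isBrokerReachable('localhost', 9092)) { die("Kafka is not reachable\n"); }`.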


Installation

Add the lib directory to the PHP include_path and use an autoloader like the one in the examples directory (the code follows the PEAR/Zend one-class-per-file convention).
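A minimal autoloader following that convention could look like the sketch below. It is written in the same spirit as the one in the examples directory, but the lib path is an assumption you should adjust to your layout; the mapping turns a class name such as Kafka_FetchRequest into Kafka/FetchRequest.php.

```php
<?php
// Prepend the kafka-php lib directory to the include_path
// (adjust __DIR__ . '/lib' to wherever you placed the library).
set_include_path(implode(PATH_SEPARATOR, array(
    __DIR__ . '/lib',
    get_include_path(),
)));

// PEAR/Zend-style autoloader: underscores in the class name map to
// directory separators, plus a '.php' extension.
spl_autoload_register(function ($className) {
    $file = str_replace('_', DIRECTORY_SEPARATOR, $className) . '.php';
    // stream_resolve_include_path() returns false when the file
    // cannot be found anywhere on the include_path
    if (stream_resolve_include_path($file) !== false) {
        require $file;
    }
});
```

With this in place, the producer and consumer examples below work without any explicit require statements.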


Usage

The examples directory contains examples of a Producer, a simple Consumer, and the Zookeeper-based Consumer.

Example Producer:

$producer = new Kafka_Producer('localhost', 9092, Kafka_Encoder::COMPRESSION_NONE);
$messages = array('some', 'messages', 'here');
$topic = 'test';
$bytes = $producer->send($messages, $topic);

Example Consumer:

$topic         = 'test';
$partition     = 0;
$offset        = 0;
$maxSize       = 1000000;
$socketTimeout = 5;

while (true) {
    $consumer     = new Kafka_SimpleConsumer('localhost', 9092, $socketTimeout, $maxSize);
    $fetchRequest = new Kafka_FetchRequest($topic, $partition, $offset, $maxSize);
    $messages     = $consumer->fetch($fetchRequest);
    foreach ($messages as $msg) {
        echo "\nMessage: " . $msg->payload();
    }
    // advance the offset after consuming each MessageSet
    $offset += $messages->validBytes();
    unset($fetchRequest);
}
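With Kafka 0.6/0.7 the consumer tracks its own offset, so a restarted process resumes from whatever offset you pass to Kafka_FetchRequest. One simple way to survive restarts is to checkpoint the offset to disk after each fetch. The helpers below are an illustrative sketch, not part of kafka-php; the function names and checkpoint file are assumptions.

```php
<?php
// Illustrative offset checkpointing (not part of kafka-php): persist the
// last consumed offset so a restarted consumer can resume where it left off.
function loadOffset($file)
{
    if (!is_readable($file)) {
        return 0; // no checkpoint yet: start from the beginning
    }
    return (int) file_get_contents($file);
}

function saveOffset($file, $offset)
{
    // write to a temp file first, then rename, so a crash mid-write
    // never leaves a truncated checkpoint behind
    $tmp = $file . '.tmp';
    file_put_contents($tmp, (string) $offset);
    rename($tmp, $file);
}
```

In the consumer loop above you would call `$offset = loadOffset('consumer.offset');` once before the loop, and `saveOffset('consumer.offset', $offset);` right after advancing $offset with validBytes().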


TODO

  • support for Snappy compression

Contact for questions

Lorenzo Alberton

l.alberton at(@) quipo.it


