Shared insights on Kafka and Hadoop at Pinterest
The early data ingestion pipeline at Pinterest used Kafka as the central message transport: the app servers wrote messages directly to Kafka, and the resulting log files were then uploaded to S3.
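
A rough sketch of that first hop (not Pinterest's actual code): an app server publishing log messages through a Kafka producer client. The broker address, topic name, and message schema here are assumptions for illustration.

    import json
    import time
    from kafka import KafkaProducer  # pip install kafka-python

    # Hypothetical broker address; a real fleet would list several brokers.
    producer = KafkaProducer(
        bootstrap_servers=["kafka-broker:9092"],
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    def log_event(event_type, payload):
        # Publish one application log message to an assumed "app_events" topic.
        producer.send("app_events", {
            "type": event_type,
            "payload": payload,
            "ts": int(time.time() * 1000),
        })

    log_event("page_view", {"user": 123, "board": 456})
    producer.flush()  # block until buffered messages have reached the brokers

Downstream, per the description above, the accumulated log data was batched into files and shipped to S3.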

For databases, a custom Hadoop streamer pulled data from the databases and wrote it to S3.
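
As a minimal stand-in for that custom streamer (the actual tool ran on Hadoop and is not shown here), the sketch below reads rows from an assumed MySQL table and writes a dated CSV dump to an assumed S3 bucket; every host, credential, table, and bucket name is a placeholder.

    import csv
    import datetime
    import io
    import boto3    # pip install boto3
    import pymysql  # pip install pymysql

    # Hypothetical connection details for illustration only.
    conn = pymysql.connect(host="db-host", user="reader",
                           password="...", database="pins")
    s3 = boto3.client("s3")

    buf = io.StringIO()
    writer = csv.writer(buf)
    with conn.cursor() as cur:
        cur.execute("SELECT id, user_id, created_at FROM pins")
        for row in cur:
            writer.writerow(row)

    # One dated object per run, mirroring a periodic batch dump to S3.
    key = "db-dumps/pins/%s.csv" % datetime.date.today().isoformat()
    s3.put_object(Bucket="example-warehouse-bucket", Key=key,
                  Body=buf.getvalue().encode("utf-8"))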

Challenges cited for this infrastructure included high operational overhead, as well as potential data loss when Kafka broker outages caused in-memory message buffers to overflow.
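
This is the classic failure mode of asynchronous producers: messages sit in a client-side memory buffer, and if brokers are unreachable long enough the buffer fills and new sends are dropped or block. A hedged illustration of the knobs involved, using kafka-python producer settings with example values (not Pinterest's configuration):

    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers=["kafka-broker:9092"],
        acks="all",                       # wait for broker acknowledgement
        retries=5,                        # retry transient broker errors
        buffer_memory=32 * 1024 * 1024,   # cap on client-side buffering
        max_block_ms=10_000,              # how long send() blocks when the buffer is full
    )

    def on_send_error(exc):
        # During a long broker outage the buffer fills and sends start failing
        # here; without a fallback such as spooling to local disk, these
        # messages are lost.
        print("dropped message: %r" % exc)

    producer.send("app_events", b"payload").add_errback(on_send_error)

Bounding the buffer protects the app server's memory, but once the block timeout expires the send fails, so without a durable local spool those messages are gone, which is the data-loss risk described above.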

Scalable and reliable data ingestion at Pinterest - Pinterest Engineering - Medium (medium.com)