[System Design Tech Case Study Pulse #26] Processing 2 Billion Daily Queries: How Facebook Graph Search Actually Works

With a detailed explanation and flow chart.

Naina Chaturvedi's avatar
Naina Chaturvedi
Nov 05, 2024


Hi All,

Facebook's Graph Search processes more than 2 billion queries per day using Unicorn, a custom-built inverted-index and retrieval system, together with Apache Thrift for efficient inter-service communication. This system forms the backbone of Facebook's ability to deliver real-time, personalized search results across its vast social graph.

Let me dive deep into how this system works, exploring the key components, technologies, and processes that enable such massive-scale, low-latency graph search capabilities. 
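Before going further, it helps to see the core idea behind an inverted index like Unicorn's in miniature: map each (edge-type, entity) term to a sorted posting list of ids, then answer structured queries by intersecting lists. The sketch below is illustrative only; the class, term encoding, and data are assumptions for the example, not Facebook's actual API.

```python
from collections import defaultdict

class InvertedIndex:
    """Toy inverted index: term -> sorted posting list of entity ids."""

    def __init__(self):
        self.postings = defaultdict(list)

    def add(self, term, entity_id):
        lst = self.postings[term]
        if not lst or entity_id > lst[-1]:
            lst.append(entity_id)          # keep postings sorted

    def lookup(self, term):
        return self.postings.get(term, [])

    def and_query(self, *terms):
        # Intersect posting lists, smallest first, to answer structured
        # queries like "friends of Alice who like cycling".
        lists = sorted((self.lookup(t) for t in terms), key=len)
        result = set(lists[0])
        for lst in lists[1:]:
            result &= set(lst)
        return sorted(result)

idx = InvertedIndex()
# Graph edges encoded as string terms: friend:<user>, likes:<page>
for uid in (2, 3, 5):
    idx.add("friend:alice", uid)
for uid in (3, 5, 8):
    idx.add("likes:cycling", uid)

print(idx.and_query("friend:alice", "likes:cycling"))  # [3, 5]
```

Intersecting shortest-first keeps the working set small, which matters when posting lists span trillions of entities.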



System Overview 

Before we delve into the search architecture, let us look at some key metrics of Facebook's Graph Search system: 

- Daily search queries: 2 billion+ 

- Peak queries per second: 100,000+ 

- Indexed entities: Trillions (users, posts, pages, etc.) 

- Edge types in the graph: 100,000+ 

- Average query latency: < 100ms 

- Unicorn servers: 1000+ 

- Data centers: 10+ 

- Daily index updates: Billions 

- Query types supported: Keyword, structured, natural language 

- Languages supported: 100+ 

- System availability: 99.99% 

- Index size: Petabytes of data 

- Thrift RPC calls per query: 50+ on average 
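A quick back-of-envelope check ties these numbers together: 2 billion queries per day works out to roughly 23,000 queries per second on average, so a 100,000+ peak implies a peak-to-average ratio of about 4x, which the system must absorb without blowing the sub-100ms latency budget.

```python
# Back-of-envelope check on the metrics above.
daily_queries = 2_000_000_000
seconds_per_day = 24 * 60 * 60          # 86,400

avg_qps = daily_queries / seconds_per_day
peak_qps = 100_000

print(f"average QPS: {avg_qps:,.0f}")                     # ~23,148
print(f"peak/average ratio: {peak_qps / avg_qps:.1f}x")   # ~4.3x
```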



How It Works

1. User submits a search query through the Facebook App or Website.
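A minimal sketch of what this first step might look like on the client side: the query is packaged with the viewer's identity (results are personalized) and locale before being sent to the search tier. The field names and normalization here are assumptions for illustration; the real Thrift-defined request schema is not public.

```python
from dataclasses import dataclass

@dataclass
class SearchRequest:
    viewer_id: int         # who is searching; results are personalized
    query: str             # raw keyword / natural-language query text
    locale: str = "en_US"  # one of the 100+ supported languages
    max_results: int = 10

def submit_query(viewer_id, text):
    # Normalize raw text before handing it to the search tier.
    return SearchRequest(viewer_id=viewer_id, query=text.strip().lower())

req = submit_query(42, "  Friends who like Cycling ")
print(req.query)  # "friends who like cycling"
```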
