Engineering & Architecture

Designing a Smart Search System with Lazy Loading


We worked on enhancing an existing holiday booking platform that provides: 

  • Flight search  
  • Property listings  
  • Complete holiday packages  

All data is integrated from a third-party system, iVector. 

While the platform was functionally complete, it faced significant performance challenges in production. 

 

The Challenge 

The existing system (built using PHP) followed a blocking architecture: 

  • All supplier APIs were triggered  
  • The system waited for every response  
  • A complete dataset was returned only after all calls finished  

Impact: 

  • Average response time: ~23 seconds  
  • Poor user experience due to long wait times  
  • Increased API timeouts from external suppliers  
  • Limited scalability under higher traffic  

 

Our Approach 

Instead of applying incremental fixes, we re-architected the search system to improve both performance and scalability. 

 

Key Enhancements 

  1. Lazy Loading Implementation

We introduced a lazy loading mechanism where: 

  • Initial results are returned immediately  
  • Remaining data is fetched asynchronously  
  • Results are progressively updated  

This ensured users could interact with the system without waiting for full data completion. 
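As a rough illustration of the flow above (the production system is PHP; this Python sketch, with hypothetical `fetch_fast`/`fetch_slow` helpers standing in for supplier calls, only mirrors the idea):

```python
import threading
import time

def fetch_fast():
    # Stands in for data that is available immediately (e.g. cached results)
    return [{"id": 1, "source": "cache"}]

def fetch_slow():
    time.sleep(0.1)  # simulates a slow supplier API
    return [{"id": 2, "source": "supplier"}]

class LazySearch:
    """Return initial results at once; merge the rest as they arrive."""

    def __init__(self):
        self.results = fetch_fast()  # returned immediately
        self._lock = threading.Lock()
        self._worker = threading.Thread(target=self._load_remaining)
        self._worker.start()

    def _load_remaining(self):
        more = fetch_slow()  # fetched asynchronously
        with self._lock:
            self.results.extend(more)  # progressive update

    def snapshot(self):
        with self._lock:
            return list(self.results)

search = LazySearch()
first = search.snapshot()   # the user can interact with this right away
search._worker.join()       # in practice the frontend polls instead
final = search.snapshot()   # now contains the slow supplier's data too
```

The key point is that `first` is usable before the slow call returns, which is exactly what breaks the 23-second blocking wait.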

 

  2. Parallel API Execution

Supplier APIs were redesigned to execute in parallel rather than sequentially. 

Benefits: 

  • Reduced dependency on the slowest API  
  • Improved overall response time  
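A minimal sketch of the sequential-to-parallel shift, again in Python rather than the platform's PHP, with mock supplier calls whose latencies are made up for illustration:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_supplier(name, latency):
    time.sleep(latency)  # stands in for an HTTP request to a supplier
    return {"supplier": name, "offers": [f"{name}-offer"]}

# Hypothetical suppliers and latencies
suppliers = [("flights", 0.05), ("hotels", 0.1), ("packages", 0.08)]

start = time.monotonic()
with ThreadPoolExecutor(max_workers=len(suppliers)) as pool:
    futures = [pool.submit(call_supplier, name, lat) for name, lat in suppliers]
    responses = [f.result() for f in futures]
elapsed = time.monotonic() - start

# Total time now tracks the slowest single call (~0.1 s),
# not the sum of all calls (~0.23 s) as in the sequential design.
```

This is why the redesign reduces dependency on the slowest API: the slowest supplier sets the floor, but no longer adds to every other call's time.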

 

  3. Intelligent Caching with Redis

We integrated Redis to optimize repeated data access. 

  • Search results stored using unique identifiers  
  • Incremental updates as supplier responses arrive  
  • Faster retrieval for pagination and repeat queries  
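The caching scheme can be sketched as follows. To keep the example self-contained, a plain dict stands in for Redis; in a real deployment the same operations would map onto a Redis hash per search key (the key derivation and function names here are illustrative, not the platform's actual API):

```python
import hashlib
import json

store = {}  # stand-in for a Redis connection

def search_key(params):
    """Unique identifier for a search, derived from its parameters."""
    raw = json.dumps(params, sort_keys=True)
    return "search:" + hashlib.sha256(raw.encode()).hexdigest()[:16]

def cache_supplier_result(key, supplier, offers):
    """Incrementally add one supplier's response under the search key."""
    store.setdefault(key, {})[supplier] = offers

def cached_results(key):
    """Fast retrieval for pagination and repeat queries."""
    return store.get(key, {})

key = search_key({"dest": "Malta", "nights": 7})
cache_supplier_result(key, "hotels", ["hotel-a", "hotel-b"])
cache_supplier_result(key, "flights", ["flight-x"])
```

Because the key is derived deterministically from the search parameters, a repeat query or a pagination request hits the cache instead of re-triggering supplier calls.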

 

  4. Incremental Data Processing

Instead of waiting for complete data: 

  • Each supplier response is processed independently  
  • Results are merged dynamically  
  • Users receive continuously improving result sets  
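The merge step above might look something like this sketch (Python for illustration; the field names and cheapest-wins merge rule are assumptions, not the platform's exact logic):

```python
def process(response):
    """Normalise one supplier's payload independently of the others."""
    return [{"id": o["id"], "price": o["price"]} for o in response["offers"]]

def merge(current, incoming):
    """Merge dynamically, keeping the cheapest entry per offer id."""
    by_id = {o["id"]: o for o in current}
    for offer in incoming:
        if offer["id"] not in by_id or offer["price"] < by_id[offer["id"]]["price"]:
            by_id[offer["id"]] = offer
    return sorted(by_id.values(), key=lambda o: o["price"])

results = []
arrivals = [  # supplier responses, handled in arrival order
    {"offers": [{"id": "h1", "price": 120}, {"id": "h2", "price": 90}]},
    {"offers": [{"id": "h1", "price": 110}, {"id": "h3", "price": 150}]},
]
for response in arrivals:
    results = merge(results, process(response))
```

After each arrival `results` is a complete, consistent set, so the frontend can render it immediately and simply re-render as later suppliers respond.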

 

  5. Request Status Management

We introduced request lifecycle tracking: 

  • loading → data is still being fetched  
  • completed → all sources have responded  

This improved both frontend handling and user experience. 
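The lifecycle states map naturally onto a small tracker, sketched here in Python (class and method names are illustrative):

```python
from enum import Enum

class Status(Enum):
    LOADING = "loading"       # data is still being fetched
    COMPLETED = "completed"   # all sources have responded

class RequestTracker:
    """Tracks which suppliers are still pending for one search request."""

    def __init__(self, expected_suppliers):
        self.pending = set(expected_suppliers)

    def mark_done(self, supplier):
        self.pending.discard(supplier)

    @property
    def status(self):
        return Status.LOADING if self.pending else Status.COMPLETED

tracker = RequestTracker({"flights", "hotels", "packages"})
tracker.mark_done("flights")
mid = tracker.status          # still loading
tracker.mark_done("hotels")
tracker.mark_done("packages")
done = tracker.status         # all sources responded
```

Exposing this status alongside each partial result set is what lets the frontend decide whether to keep polling or stop.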

 

Results 

The redesigned system delivered significant improvements across multiple areas. The average response time dropped from approximately 23 seconds to around 4 seconds. Data handling moved from a fully blocking approach to a progressive, lazy-loaded model, allowing users to see results much earlier. API execution shifted from sequential to parallel processing, eliminating delays caused by slower external services. Additionally, caching, previously absent, is now handled by Redis, enabling faster data retrieval and improved overall performance. 

 

Key Outcomes 

  • Faster initial response for users  
  • Improved system scalability  
  • Reduced dependency on external API response times  
  • Enhanced user experience with progressive data loading  

 

Conclusion 

By redesigning the search architecture with lazy loading, parallel processing, and intelligent caching, we transformed a slow, blocking system into a fast and scalable solution. 

Performance improvements are not always about optimizing code —
sometimes, they require rethinking the entire flow. 

